Mar 12 13:09:36 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 12 13:09:36 crc restorecon[4746]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 12 13:09:36 crc restorecon[4746]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 
13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36
crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 
13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc 
restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:36 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc 
restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc 
restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc 
restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc 
restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 12 13:09:37 crc restorecon[4746]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 12 13:09:37 crc kubenswrapper[4921]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 13:09:37 crc kubenswrapper[4921]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 12 13:09:37 crc kubenswrapper[4921]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 13:09:37 crc kubenswrapper[4921]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 12 13:09:37 crc kubenswrapper[4921]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 12 13:09:37 crc kubenswrapper[4921]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.714489 4921 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724720 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724810 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724837 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724846 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724855 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724866 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724904 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724913 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724921 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724930 4921 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724938 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724946 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724954 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724990 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.724998 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725007 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725015 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725023 4921 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725034 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725044 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725081 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725090 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725098 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725105 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725114 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725122 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725130 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725165 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725186 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725194 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725203 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725211 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725219 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725259 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725269 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725278 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725288 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725298 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725308 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725346 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725355 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725364 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725373 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725383 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725394 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725436 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725447 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725456 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725464 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725473 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725480 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725488 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725524 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725532 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725540 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725548 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725556 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725564 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725572 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725579 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725615 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725622 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725632 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725640 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725647 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725656 4921 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725664 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725705 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725715 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725723 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.725732 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.726968 4921 flags.go:64] FLAG: --address="0.0.0.0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.726997 4921 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727014 4921 flags.go:64] FLAG: --anonymous-auth="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727027 4921 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727039 4921 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727048 4921 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727062 4921 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727074 4921 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727084 4921 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727093 4921 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727103 4921 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727113 4921 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727153 4921 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727162 4921 flags.go:64] FLAG: --cgroup-root=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727172 4921 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727181 4921 flags.go:64] FLAG: --client-ca-file=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727190 4921 flags.go:64] FLAG: --cloud-config=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727199 4921 flags.go:64] FLAG: --cloud-provider=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727210 4921 flags.go:64] FLAG: --cluster-dns="[]"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727227 4921 flags.go:64] FLAG: --cluster-domain=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727237 4921 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727247 4921 flags.go:64] FLAG: --config-dir=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727257 4921 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727267 4921 flags.go:64] FLAG: --container-log-max-files="5"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727278 4921 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727288 4921 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727298 4921 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727309 4921 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727319 4921 flags.go:64] FLAG: --contention-profiling="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727328 4921 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727338 4921 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727347 4921 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727357 4921 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727369 4921 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727378 4921 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727387 4921 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727396 4921 flags.go:64] FLAG: --enable-load-reader="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727405 4921 flags.go:64] FLAG: --enable-server="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727415 4921 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727426 4921 flags.go:64] FLAG: --event-burst="100"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727436 4921 flags.go:64] FLAG: --event-qps="50"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727446 4921 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727455 4921 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727465 4921 flags.go:64] FLAG: --eviction-hard=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727477 4921 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727486 4921 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727495 4921 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727506 4921 flags.go:64] FLAG: --eviction-soft=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727515 4921 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727524 4921 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727534 4921 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727543 4921 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727552 4921 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727561 4921 flags.go:64] FLAG: --fail-swap-on="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727570 4921 flags.go:64] FLAG: --feature-gates=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727582 4921 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727591 4921 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727600 4921 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727609 4921 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727622 4921 flags.go:64] FLAG: --healthz-port="10248"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727631 4921 flags.go:64] FLAG: --help="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727640 4921 flags.go:64] FLAG: --hostname-override=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727648 4921 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727658 4921 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727667 4921 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727676 4921 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727685 4921 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727694 4921 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727702 4921 flags.go:64] FLAG: --image-service-endpoint=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727712 4921 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727721 4921 flags.go:64] FLAG: --kube-api-burst="100"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727729 4921 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727739 4921 flags.go:64] FLAG: --kube-api-qps="50"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727750 4921 flags.go:64] FLAG: --kube-reserved=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727759 4921 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727768 4921 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727777 4921 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727785 4921 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727794 4921 flags.go:64] FLAG: --lock-file=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727803 4921 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727852 4921 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727862 4921 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727888 4921 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727898 4921 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727908 4921 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727917 4921 flags.go:64] FLAG: --logging-format="text"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727926 4921 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727936 4921 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727945 4921 flags.go:64] FLAG: --manifest-url=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727954 4921 flags.go:64] FLAG: --manifest-url-header=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727967 4921 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727977 4921 flags.go:64] FLAG: --max-open-files="1000000"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727989 4921 flags.go:64] FLAG: --max-pods="110"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.727998 4921 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728008 4921 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728017 4921 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728027 4921 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728036 4921 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728045 4921 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728055 4921 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728076 4921 flags.go:64] FLAG: --node-status-max-images="50"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728085 4921 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728095 4921 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728105 4921 flags.go:64] FLAG: --pod-cidr=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728114 4921 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728129 4921 flags.go:64] FLAG: --pod-manifest-path=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728139 4921 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728149 4921 flags.go:64] FLAG: --pods-per-core="0"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728159 4921 flags.go:64] FLAG: --port="10250"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728168 4921 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728179 4921 flags.go:64] FLAG: --provider-id=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728188 4921 flags.go:64] FLAG: --qos-reserved=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728198 4921 flags.go:64] FLAG: --read-only-port="10255"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728207 4921 flags.go:64] FLAG: --register-node="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728218 4921 flags.go:64] FLAG: --register-schedulable="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728228 4921 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728243 4921 flags.go:64] FLAG: --registry-burst="10"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728252 4921 flags.go:64] FLAG: --registry-qps="5"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728261 4921 flags.go:64] FLAG: --reserved-cpus=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728271 4921 flags.go:64] FLAG: --reserved-memory=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728283 4921 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728293 4921 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728326 4921 flags.go:64] FLAG: --rotate-certificates="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728337 4921 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728346 4921 flags.go:64] FLAG: --runonce="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728356 4921 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728366 4921 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728375 4921 flags.go:64] FLAG: --seccomp-default="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728385 4921 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728394 4921 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728404 4921 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728413 4921 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728423 4921 flags.go:64] FLAG: --storage-driver-password="root"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728433 4921 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728442 4921 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728451 4921 flags.go:64] FLAG: --storage-driver-user="root"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728460 4921 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728470 4921 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728479 4921 flags.go:64] FLAG: --system-cgroups=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728488 4921 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728502 4921 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728511 4921 flags.go:64] FLAG: --tls-cert-file=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728520 4921 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728534 4921 flags.go:64] FLAG: --tls-min-version=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728544 4921 flags.go:64] FLAG: --tls-private-key-file=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728553 4921 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728563 4921 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728572 4921 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728581 4921 flags.go:64] FLAG: --v="2"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728594 4921 flags.go:64] FLAG: --version="false"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728606 4921 flags.go:64] FLAG: --vmodule=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728618 4921 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.728629 4921 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728898 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728912 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728924 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728934 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728944 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728953 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728962 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728970 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728978 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728987 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.728994 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729002 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729010 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729018 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729026 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729034 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729041 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729050 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729057 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729065 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729073 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729081 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729092 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729102 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729112 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729121 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729131 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729140 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729150 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729160 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729169 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729208 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729218 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729226 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729234 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729242 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729250 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729258 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729267 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729274 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729282 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729291 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729299 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729306 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729314 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729322 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729331 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729339 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729346 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729354 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729361 4921 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729369 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729377 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729386 4921 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729394 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729401 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729410 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729418 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729426 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729441 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729449 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729457 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729465 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729473 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729483 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729492 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729502 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729511 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729520 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729528 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.729536 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.729551 4921 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.742105 4921 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.742156 4921 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742291 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742306 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742316 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742326 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742335 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742343 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742351 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742358 4921 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742366 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742378 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742391 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742400 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742409 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742419 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742428 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742436 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742444 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742451 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742461 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742472 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742481 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742490 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742498 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742507 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742515 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742524 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742532 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742540 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742548 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742556 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742563 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742571 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742581 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742590 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742603 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742613 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742623 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742633 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742643 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742652 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742661 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742669 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742677 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742686 4921 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742697 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742716 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742732 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742743 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742754 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742763 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742773 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742783 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742792 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742801 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742846 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742863 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742877 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742889 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742900 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742910 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742921 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742933 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742943 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742953 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742965 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742975 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742985 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.742996 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743004 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743013 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743025 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.743040 4921 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743266 4921 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743278 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743289 4921 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743300 4921 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743312 4921 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743325 4921 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743347 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743359 4921 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743370 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743382 4921 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743391 4921 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743401 4921 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743409 4921 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743416 4921 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743425 4921 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743433 4921 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743440 4921 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743448 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743455 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743464 4921 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743471 4921 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743479 4921 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743487 4921 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743494 4921 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743502 4921 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743511 4921 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743518 4921 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743526 4921 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743533 4921 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743540 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743548 4921 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743556 4921 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743563 4921 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743570 4921 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743580 4921 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743588 4921 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743596 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743603 4921 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743610 4921 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743618 4921 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743625 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743633 4921 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743641 4921 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743648 4921 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743656 4921 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743663 4921 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743671 4921 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743683 4921 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743694 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743703 4921 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743712 4921 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743721 4921 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743729 4921 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743738 4921 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743745 4921 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743753 4921 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743761 4921 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743771 4921 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743780 4921 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743788 4921 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743795 4921 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743803 4921 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743844 4921 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743855 4921 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743865 4921 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743873 4921 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743881 4921 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743890 4921 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743898 4921 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743906 4921 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.743915 4921 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.743928 4921 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.744260 4921 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.748925 4921 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.754589 4921 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.754798 4921 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.757038 4921 server.go:997] "Starting client certificate rotation"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.757087 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.757942 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.785355 4921 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.788776 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.789173 4921 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.807732 4921 log.go:25] "Validated CRI v1 runtime API"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.850800 4921 log.go:25] "Validated CRI v1 image API"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.852975 4921 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.859106 4921 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-12-13-02-58-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.859157 4921 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.885878 4921 manager.go:217] Machine: {Timestamp:2026-03-12 13:09:37.882443175 +0000 UTC m=+0.572515226 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2fb4ffa9-fae0-4002-98df-640245dc5e65 BootID:cb0bf9b7-9747-40d7-a967-f44b0632d26d Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:01:a0:34 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:01:a0:34 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e8:34:6c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b2:c0:ff Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:31:9f:0a Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:35:cb Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:fd:91:21 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9e:0b:16:3f:84:6f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:26:59:1e:5e:83:e7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.886285 4921 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.886626 4921 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.888061 4921 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.888440 4921 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.888516 4921 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.889034 4921 topology_manager.go:138] "Creating topology manager with none policy"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.889061 4921 container_manager_linux.go:303] "Creating device plugin manager"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.889762 4921 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.889864 4921 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.890709 4921 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.890917 4921 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.894451 4921 kubelet.go:418] "Attempting to sync node with API server"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.894492 4921 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.894604 4921 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.894637 4921 kubelet.go:324] "Adding apiserver pod source"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.894662 4921 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.899534 4921 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.901081 4921 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.901096 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.901282 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.901347 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.901380 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.903642 4921 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 12 13:09:37 
crc kubenswrapper[4921]: I0312 13:09:37.905624 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.905842 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.905965 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906098 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906221 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906326 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906427 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906565 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906682 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906788 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.906936 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.907060 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.908086 4921 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.908970 4921 server.go:1280] "Started kubelet" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 
13:09:37.909943 4921 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.909918 4921 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 13:09:37 crc systemd[1]: Started Kubernetes Kubelet. Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.913390 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.913782 4921 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.915966 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.916018 4921 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.916332 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.916392 4921 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.916449 4921 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.916656 4921 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.916756 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" 
Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.916847 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.916918 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.923879 4921 factory.go:55] Registering systemd factory Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.923933 4921 factory.go:221] Registration of the systemd container factory successfully Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.924845 4921 factory.go:153] Registering CRI-O factory Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.924891 4921 factory.go:221] Registration of the crio container factory successfully Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.924987 4921 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.925021 4921 factory.go:103] Registering Raw factory Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.925045 4921 manager.go:1196] Started watching for new ooms in manager Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.925780 4921 manager.go:319] Starting recovery of all containers Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.926605 4921 server.go:460] "Adding debug handlers to kubelet 
server" Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.930268 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c1a08b1027d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,LastTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.940931 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941059 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941080 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941099 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941117 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941130 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941149 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941248 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941270 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941282 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941300 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941344 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941358 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941381 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941394 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941435 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941451 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941513 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941529 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941543 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941561 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941576 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 12 13:09:37 
crc kubenswrapper[4921]: I0312 13:09:37.941635 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941671 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941683 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941750 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941795 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941831 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941854 4921 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941868 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941880 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941956 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941970 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.941988 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942021 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942059 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942076 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942088 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942105 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942144 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942156 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942173 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942186 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942240 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942254 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942268 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942285 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942323 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942339 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942357 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942369 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942391 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942453 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" 
seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942480 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942503 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942553 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942646 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942665 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942683 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942700 4921 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942717 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942732 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.942745 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945054 4921 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945136 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 
13:09:37.945165 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945237 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945261 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945282 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945312 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945335 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945381 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945402 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945425 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945469 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945492 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945521 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945583 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945604 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945649 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945670 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945692 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945719 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945741 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945768 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945790 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945853 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945897 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945918 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945945 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945965 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.945988 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946042 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946068 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946096 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946178 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946204 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946225 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946249 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946268 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946288 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946312 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946372 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946493 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946520 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946556 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946581 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946600 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946641 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946668 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946684 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946703 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946723 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946742 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946762 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946777 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946793 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946898 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946915 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946932 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946946 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946961 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946974 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.946985 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947000 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947014 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947030 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947058 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947071 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947088 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947101 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947116 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947129 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947142 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947160 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947174 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947186 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947197 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947208 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947220 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947231 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947247 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947257 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947268 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947282 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947292 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947303 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947315 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947326 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947338 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947356 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947366 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947380 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947393 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947406 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947416 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947426 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947440 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947449 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947462 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947473 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947485 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947498 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947511 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947523 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947535 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947546 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947560 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947571 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947585 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947609 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947619 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947632 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947646 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947656 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947668 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947737 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947751 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947762 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947771 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947783 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947867 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.947883 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949113 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949177 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949195 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949212 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949230 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949245 4921
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949259 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949276 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949292 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949307 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949332 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949348 4921 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949364 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949424 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949443 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949461 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949477 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949495 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949509 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949525 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949541 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949558 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949576 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949592 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" 
seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949607 4921 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949620 4921 reconstruct.go:97] "Volume reconstruction finished" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.949631 4921 reconciler.go:26] "Reconciler: start to sync state" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.950862 4921 manager.go:324] Recovery completed Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.959941 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.964187 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.964227 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.964240 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.966132 4921 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.966152 4921 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.966183 4921 state_mem.go:36] "Initialized new in-memory state store" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.979301 4921 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.981667 4921 policy_none.go:49] "None policy: Start" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.981946 4921 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.982021 4921 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.982057 4921 kubelet.go:2335] "Starting kubelet main sync loop" Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.982136 4921 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.982587 4921 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 12 13:09:37 crc kubenswrapper[4921]: I0312 13:09:37.982632 4921 state_mem.go:35] "Initializing new in-memory state store" Mar 12 13:09:37 crc kubenswrapper[4921]: W0312 13:09:37.982672 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:37 crc kubenswrapper[4921]: E0312 13:09:37.982744 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.017192 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.055397 4921 manager.go:334] 
"Starting Device Plugin manager" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.055530 4921 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.055551 4921 server.go:79] "Starting device plugin registration server" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.056285 4921 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.056304 4921 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.056521 4921 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.056761 4921 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.056782 4921 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.065566 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.083617 4921 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.083801 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.085702 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.085769 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.085793 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.085983 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.086242 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.086297 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.086969 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087052 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087072 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087323 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087349 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087372 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087381 4921 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087524 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.087585 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088714 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088740 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088753 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088859 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088884 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088923 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.088985 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.089160 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.089232 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.090333 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.090378 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.090394 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.090794 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.090841 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.090855 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.091106 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.091336 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.091428 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.091992 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092029 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092047 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092404 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092447 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092596 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092676 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.092698 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.093831 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.093878 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.093897 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.119001 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.151860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.151934 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.151979 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152023 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152091 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152164 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152191 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152215 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152305 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152508 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152715 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152887 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 
13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.152966 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.157204 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.158941 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.159007 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.159026 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.159069 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.159803 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254513 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254604 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254654 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254690 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254723 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254756 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254791 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254829 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254884 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254876 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254941 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.254832 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255016 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255028 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255052 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255060 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255113 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255149 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255218 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255249 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255317 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255333 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255347 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255472 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.255514 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.361018 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.362805 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.362901 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.362928 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.362978 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.363809 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.427040 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.434889 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.453219 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.470352 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.476649 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ce5e7b5e734162c484132c64f86f006c5b13a7cc4fa2d547187b5b17c8f4d44d WatchSource:0}: Error finding container ce5e7b5e734162c484132c64f86f006c5b13a7cc4fa2d547187b5b17c8f4d44d: Status 404 returned error can't find the container with id ce5e7b5e734162c484132c64f86f006c5b13a7cc4fa2d547187b5b17c8f4d44d Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.479011 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.480127 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-186c4cf5c4aa5a9f47fa41ba7e0fd2983ed2c3e9b0652598a133d271673a1586 WatchSource:0}: Error finding container 186c4cf5c4aa5a9f47fa41ba7e0fd2983ed2c3e9b0652598a133d271673a1586: Status 404 returned error can't find the container with id 186c4cf5c4aa5a9f47fa41ba7e0fd2983ed2c3e9b0652598a133d271673a1586 Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.487708 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b700691b7f77c4c918ad1e1deab7de223ed0acc92bf7d352cc231df31dcce01c WatchSource:0}: Error finding container b700691b7f77c4c918ad1e1deab7de223ed0acc92bf7d352cc231df31dcce01c: Status 404 returned error can't find 
the container with id b700691b7f77c4c918ad1e1deab7de223ed0acc92bf7d352cc231df31dcce01c Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.500257 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-1a700fb669a5e6c0a5a3d984e589d509d4a43c686e05dd65b59f5219df8ec3c6 WatchSource:0}: Error finding container 1a700fb669a5e6c0a5a3d984e589d509d4a43c686e05dd65b59f5219df8ec3c6: Status 404 returned error can't find the container with id 1a700fb669a5e6c0a5a3d984e589d509d4a43c686e05dd65b59f5219df8ec3c6 Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.508782 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-83a80123161de1b2a0bc7026b9f676c4980a6b318bb123b237545dc0b993ab7b WatchSource:0}: Error finding container 83a80123161de1b2a0bc7026b9f676c4980a6b318bb123b237545dc0b993ab7b: Status 404 returned error can't find the container with id 83a80123161de1b2a0bc7026b9f676c4980a6b318bb123b237545dc0b993ab7b Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.519807 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.764412 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.766786 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.766861 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.766876 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.766914 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.767376 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.915280 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.974247 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.974410 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.987292 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"83a80123161de1b2a0bc7026b9f676c4980a6b318bb123b237545dc0b993ab7b"} Mar 12 13:09:38 crc kubenswrapper[4921]: 
I0312 13:09:38.989136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1a700fb669a5e6c0a5a3d984e589d509d4a43c686e05dd65b59f5219df8ec3c6"} Mar 12 13:09:38 crc kubenswrapper[4921]: W0312 13:09:38.989637 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:38 crc kubenswrapper[4921]: E0312 13:09:38.989699 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.991710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b700691b7f77c4c918ad1e1deab7de223ed0acc92bf7d352cc231df31dcce01c"} Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.994636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"186c4cf5c4aa5a9f47fa41ba7e0fd2983ed2c3e9b0652598a133d271673a1586"} Mar 12 13:09:38 crc kubenswrapper[4921]: I0312 13:09:38.997282 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ce5e7b5e734162c484132c64f86f006c5b13a7cc4fa2d547187b5b17c8f4d44d"} Mar 12 13:09:39 crc 
kubenswrapper[4921]: W0312 13:09:39.200096 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:39 crc kubenswrapper[4921]: E0312 13:09:39.200193 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:39 crc kubenswrapper[4921]: W0312 13:09:39.259518 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:39 crc kubenswrapper[4921]: E0312 13:09:39.259639 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:39 crc kubenswrapper[4921]: E0312 13:09:39.321617 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.567757 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.570497 4921 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.570554 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.570568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.570604 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:39 crc kubenswrapper[4921]: E0312 13:09:39.571229 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.915297 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:39 crc kubenswrapper[4921]: I0312 13:09:39.929624 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 13:09:39 crc kubenswrapper[4921]: E0312 13:09:39.930473 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.003606 4921 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="32b279ed75332d109ab755962b567e565567c6137298ff5a2f134a3ce73578e7" exitCode=0 Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.003687 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"32b279ed75332d109ab755962b567e565567c6137298ff5a2f134a3ce73578e7"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.003746 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.005388 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.005436 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.005498 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.008079 4921 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="3ce75a021f0e1c5dd36a8d46fd1236f50ad5bbfbec12e8e6424817a9b4eeed1b" exitCode=0 Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.008163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"3ce75a021f0e1c5dd36a8d46fd1236f50ad5bbfbec12e8e6424817a9b4eeed1b"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.008267 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.010052 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.010104 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.010124 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.012957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"83573819f68113e48dd2f5d7107b7849f4ce3d6136e1c60c3b0fdbc592a3366e"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.013008 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.013020 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"379369622b29bf220970a0da26d04e99fd525d3225390a8d8226e587d1f50e73"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.013038 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"71e16d390453f815fe97adc237523571ff4da158f8bde8bb89cbf2b411fc7be9"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.013049 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40fd6609bda9c83a226d0c5c926067a50655b35b1b92b8cf4eaf211900ae707c"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.014184 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.014210 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.014220 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.016574 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca" exitCode=0 Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.016624 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.016720 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.017763 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.017784 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.017801 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.018707 4921 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4867bd95bbb7f8492762bc87122641bfe7fd05035620817688fc4b3329edf9ae" exitCode=0 Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.018753 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4867bd95bbb7f8492762bc87122641bfe7fd05035620817688fc4b3329edf9ae"} Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.018834 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.019032 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.019667 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.019711 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.019729 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.020651 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.020693 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.020716 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:40 crc kubenswrapper[4921]: W0312 13:09:40.731556 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:40 crc kubenswrapper[4921]: E0312 13:09:40.731885 4921 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:40 crc kubenswrapper[4921]: W0312 13:09:40.830564 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:40 crc kubenswrapper[4921]: E0312 13:09:40.830641 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:40 crc kubenswrapper[4921]: I0312 13:09:40.915059 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:40 crc kubenswrapper[4921]: E0312 13:09:40.922971 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.024311 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a72b00e8ff69caef15ce25a111a11566a3953665fa2fec5d041a3a61de055015"} 
Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.024359 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cf5d4b764b31e2cddd11b204e7d6b3c141089aa79a13a1730f091f2e13d99ded"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.024370 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1c0c30ab15925d0b815b23899400f7a94024971690291c534d810dbfe3dc3dc2"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.024493 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.025959 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.025989 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.026000 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.027879 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c288c93b29234471cc7fc9ed47f129c3de42b7d92e2197a52b14978e3b892d00"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.027921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6"} Mar 12 13:09:41 crc 
kubenswrapper[4921]: I0312 13:09:41.027933 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.027947 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.027957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.027981 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.028905 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.028927 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.028937 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.029695 4921 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="58f3e6e8d107fe93cc89c64fa40899e4447f066ebcf868ab3a277ad518eeee77" exitCode=0 Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.029777 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"58f3e6e8d107fe93cc89c64fa40899e4447f066ebcf868ab3a277ad518eeee77"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.029865 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.030823 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.030847 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.030857 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.031342 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9b6d02cfa996acf174ffcdcee188aa51ecf89ff3608ac81546a618ba36ffee2c"} Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.031373 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.031377 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.032384 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.032407 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.032412 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.032494 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.032527 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.032546 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.135923 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.171478 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.173232 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.173286 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.173305 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.173346 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:41 crc kubenswrapper[4921]: E0312 13:09:41.173933 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 12 13:09:41 crc kubenswrapper[4921]: W0312 13:09:41.565540 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 12 13:09:41 crc kubenswrapper[4921]: E0312 13:09:41.565946 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.684803 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.696519 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:41 crc kubenswrapper[4921]: I0312 13:09:41.841112 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.036352 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.039420 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c288c93b29234471cc7fc9ed47f129c3de42b7d92e2197a52b14978e3b892d00" exitCode=255 Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.039541 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.039538 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c288c93b29234471cc7fc9ed47f129c3de42b7d92e2197a52b14978e3b892d00"} Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.040796 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.040870 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.040887 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.041789 4921 scope.go:117] "RemoveContainer" containerID="c288c93b29234471cc7fc9ed47f129c3de42b7d92e2197a52b14978e3b892d00" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.043391 4921 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6d6fba3d362e2f33ddf059151214586a60c73f9618ade645f389eecf93176de9" exitCode=0 Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.043496 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.043537 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.043973 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6d6fba3d362e2f33ddf059151214586a60c73f9618ade645f389eecf93176de9"} Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.044119 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.044152 4921 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.044121 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.044730 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.044773 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.044791 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.045308 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.045342 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.045361 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.045420 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.045470 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.045495 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.046387 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 
13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.046413 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.046426 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:42 crc kubenswrapper[4921]: I0312 13:09:42.537674 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.052631 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.055415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7"} Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.055537 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.055608 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.057576 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.057610 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.057621 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.061378 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e8d442240f2c0c70653416d87f7a2c567be60e08ca508a597c31bceada42420e"} Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.061451 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91325fecf5db8af4ad5f4b802c9815d7896a52b07bafafee5b41b1b8e81b3611"} Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.061472 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5f9f2b548a2318cb897cb426d52944cc615a6651e65b576a9c7a7ac5d5c3ae62"} Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.061490 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f83bbddf2f16207a6ac87c9b21cf7ed9c67541034e8469a097c5e04c214ef714"} Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.061504 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.061587 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.063034 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.063097 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.063116 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:43 crc kubenswrapper[4921]: I0312 13:09:43.839390 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.068859 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc46d1c8a75facb0ebe7e63818b1ae43719402cf4c44d71209a4547fe31785b4"} Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.068896 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.068967 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.069003 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.069020 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.069040 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070532 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070579 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070596 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070532 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070656 4921 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070662 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070691 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070704 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.070674 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.136084 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.136197 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.177119 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.374376 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.375548 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.375603 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.375612 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:44 crc kubenswrapper[4921]: I0312 13:09:44.375632 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:45 crc kubenswrapper[4921]: I0312 13:09:45.071691 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:45 crc kubenswrapper[4921]: I0312 13:09:45.072890 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:45 crc kubenswrapper[4921]: I0312 13:09:45.072952 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:45 crc kubenswrapper[4921]: I0312 13:09:45.072972 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:46 crc kubenswrapper[4921]: I0312 13:09:46.303484 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:46 crc kubenswrapper[4921]: I0312 13:09:46.303784 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:46 crc kubenswrapper[4921]: I0312 13:09:46.305504 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:46 crc kubenswrapper[4921]: I0312 13:09:46.305570 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:46 crc kubenswrapper[4921]: I0312 13:09:46.305591 4921 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.377741 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.378136 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.379420 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.379475 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.379489 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.813892 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.814093 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.815457 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.815494 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.815505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.910690 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.910962 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.912524 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.912560 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:47 crc kubenswrapper[4921]: I0312 13:09:47.912571 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:48 crc kubenswrapper[4921]: E0312 13:09:48.065978 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:09:48 crc kubenswrapper[4921]: I0312 13:09:48.453784 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 12 13:09:48 crc kubenswrapper[4921]: I0312 13:09:48.454062 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:48 crc kubenswrapper[4921]: I0312 13:09:48.456199 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:48 crc kubenswrapper[4921]: I0312 13:09:48.456246 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:48 crc kubenswrapper[4921]: I0312 13:09:48.456260 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:51 crc kubenswrapper[4921]: I0312 13:09:51.842022 4921 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:09:51 crc kubenswrapper[4921]: I0312 13:09:51.842146 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:09:51 crc kubenswrapper[4921]: W0312 13:09:51.866104 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 12 13:09:51 crc kubenswrapper[4921]: I0312 13:09:51.866203 4921 trace.go:236] Trace[1296050985]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (12-Mar-2026 13:09:41.864) (total time: 10001ms): Mar 12 13:09:51 crc kubenswrapper[4921]: Trace[1296050985]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:09:51.866) Mar 12 13:09:51 crc kubenswrapper[4921]: Trace[1296050985]: [10.001634035s] [10.001634035s] END Mar 12 13:09:51 crc kubenswrapper[4921]: E0312 13:09:51.866258 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 12 13:09:51 crc kubenswrapper[4921]: I0312 13:09:51.915426 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.317334 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.321289 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.324157 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.327024 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c1a08b1027d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,LastTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:09:52 crc kubenswrapper[4921]: W0312 13:09:52.332869 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.332968 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:09:52 crc kubenswrapper[4921]: W0312 13:09:52.336332 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.336473 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:09:52 crc kubenswrapper[4921]: I0312 13:09:52.339640 4921 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 13:09:52 crc kubenswrapper[4921]: I0312 13:09:52.339834 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 12 13:09:52 crc kubenswrapper[4921]: W0312 13:09:52.340987 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z Mar 12 13:09:52 crc kubenswrapper[4921]: E0312 13:09:52.341083 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:09:52 crc kubenswrapper[4921]: I0312 13:09:52.919057 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:52Z is after 2026-02-23T05:33:13Z Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.098678 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.099518 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.103375 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" exitCode=255 Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.103445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7"} Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.103509 4921 scope.go:117] "RemoveContainer" containerID="c288c93b29234471cc7fc9ed47f129c3de42b7d92e2197a52b14978e3b892d00" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.103934 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.105248 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.105303 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.105320 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.106186 4921 scope.go:117] "RemoveContainer" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" Mar 12 13:09:53 crc kubenswrapper[4921]: E0312 13:09:53.106891 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:09:53 crc kubenswrapper[4921]: I0312 13:09:53.918155 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:53Z is after 2026-02-23T05:33:13Z Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.110102 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.136228 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.136334 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.919645 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:54Z is after 2026-02-23T05:33:13Z Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.952131 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.952357 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.953948 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.953987 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.954002 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:54 crc kubenswrapper[4921]: I0312 13:09:54.954635 4921 scope.go:117] "RemoveContainer" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" Mar 12 13:09:54 crc 
kubenswrapper[4921]: E0312 13:09:54.954868 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:09:55 crc kubenswrapper[4921]: W0312 13:09:55.560345 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:55Z is after 2026-02-23T05:33:13Z Mar 12 13:09:55 crc kubenswrapper[4921]: E0312 13:09:55.560493 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:09:55 crc kubenswrapper[4921]: I0312 13:09:55.919538 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:55Z is after 2026-02-23T05:33:13Z Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.850807 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 
13:09:56.851125 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.852912 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.852969 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.852986 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.853736 4921 scope.go:117] "RemoveContainer" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" Mar 12 13:09:56 crc kubenswrapper[4921]: E0312 13:09:56.854060 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.857483 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:09:56 crc kubenswrapper[4921]: I0312 13:09:56.920117 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:56Z is after 2026-02-23T05:33:13Z Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.122486 4921 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.124060 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.124129 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.124149 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.125412 4921 scope.go:117] "RemoveContainer" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" Mar 12 13:09:57 crc kubenswrapper[4921]: E0312 13:09:57.125918 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.851792 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.852108 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.853684 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.853740 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.853761 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.872457 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.918197 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.918398 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.919340 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:57Z is after 2026-02-23T05:33:13Z Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.920008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.920266 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:57 crc kubenswrapper[4921]: I0312 13:09:57.920549 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:58 crc kubenswrapper[4921]: E0312 13:09:58.066138 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.124548 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.125331 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.125383 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.125401 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:58 crc kubenswrapper[4921]: E0312 13:09:58.722966 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:58Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.725230 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.727167 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.727235 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.727263 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:09:58 crc kubenswrapper[4921]: I0312 13:09:58.727323 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:09:58 crc kubenswrapper[4921]: E0312 13:09:58.731594 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 13:09:58 crc 
kubenswrapper[4921]: I0312 13:09:58.918275 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:58Z is after 2026-02-23T05:33:13Z Mar 12 13:09:59 crc kubenswrapper[4921]: W0312 13:09:59.455501 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:59Z is after 2026-02-23T05:33:13Z Mar 12 13:09:59 crc kubenswrapper[4921]: E0312 13:09:59.455675 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:09:59 crc kubenswrapper[4921]: I0312 13:09:59.919591 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:09:59Z is after 2026-02-23T05:33:13Z Mar 12 13:10:00 crc kubenswrapper[4921]: I0312 13:10:00.919795 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-12T13:10:00Z is after 2026-02-23T05:33:13Z Mar 12 13:10:00 crc kubenswrapper[4921]: I0312 13:10:00.921998 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 13:10:00 crc kubenswrapper[4921]: E0312 13:10:00.927607 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:01 crc kubenswrapper[4921]: I0312 13:10:01.920604 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:01Z is after 2026-02-23T05:33:13Z Mar 12 13:10:02 crc kubenswrapper[4921]: W0312 13:10:02.246275 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:02Z is after 2026-02-23T05:33:13Z Mar 12 13:10:02 crc kubenswrapper[4921]: E0312 13:10:02.246380 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-12T13:10:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:02 crc kubenswrapper[4921]: E0312 13:10:02.333008 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c1a08b1027d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,LastTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:02 crc kubenswrapper[4921]: I0312 13:10:02.919510 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:02Z is after 2026-02-23T05:33:13Z Mar 12 13:10:03 crc kubenswrapper[4921]: W0312 13:10:03.285101 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:03Z is after 2026-02-23T05:33:13Z Mar 12 13:10:03 crc kubenswrapper[4921]: E0312 13:10:03.285232 4921 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:03 crc kubenswrapper[4921]: I0312 13:10:03.919063 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:03Z is after 2026-02-23T05:33:13Z Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.137594 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.137727 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.137869 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.138028 4921 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.139097 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.139140 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.139155 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.139705 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"71e16d390453f815fe97adc237523571ff4da158f8bde8bb89cbf2b411fc7be9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.139889 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://71e16d390453f815fe97adc237523571ff4da158f8bde8bb89cbf2b411fc7be9" gracePeriod=30 Mar 12 13:10:04 crc kubenswrapper[4921]: W0312 13:10:04.536938 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:04Z is after 2026-02-23T05:33:13Z Mar 12 13:10:04 crc kubenswrapper[4921]: E0312 13:10:04.537051 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:04 crc kubenswrapper[4921]: I0312 13:10:04.921310 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:04Z is after 2026-02-23T05:33:13Z Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.146539 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.147087 4921 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="71e16d390453f815fe97adc237523571ff4da158f8bde8bb89cbf2b411fc7be9" exitCode=255 Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.147150 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"71e16d390453f815fe97adc237523571ff4da158f8bde8bb89cbf2b411fc7be9"} Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.147195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef85ddea9ad217de7e7535c7f3402ce7b136a8b3779025e0f48a59124968e814"} Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.147340 4921 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.148731 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.148842 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.148864 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:05 crc kubenswrapper[4921]: E0312 13:10:05.730138 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:05Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.732238 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.734496 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.734543 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.734561 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.734595 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:05 crc kubenswrapper[4921]: E0312 13:10:05.739675 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:05Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 13:10:05 crc kubenswrapper[4921]: I0312 13:10:05.920380 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:05Z is after 2026-02-23T05:33:13Z Mar 12 13:10:06 crc kubenswrapper[4921]: I0312 13:10:06.917372 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:06Z is after 2026-02-23T05:33:13Z Mar 12 13:10:07 crc kubenswrapper[4921]: I0312 13:10:07.918209 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:07Z is after 2026-02-23T05:33:13Z Mar 12 13:10:08 crc kubenswrapper[4921]: E0312 13:10:08.066333 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:10:08 crc kubenswrapper[4921]: I0312 13:10:08.917356 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:08Z is after 2026-02-23T05:33:13Z Mar 
12 13:10:09 crc kubenswrapper[4921]: I0312 13:10:09.919319 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:09Z is after 2026-02-23T05:33:13Z Mar 12 13:10:10 crc kubenswrapper[4921]: I0312 13:10:10.919701 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:10Z is after 2026-02-23T05:33:13Z Mar 12 13:10:10 crc kubenswrapper[4921]: I0312 13:10:10.982412 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:10 crc kubenswrapper[4921]: I0312 13:10:10.984154 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:10 crc kubenswrapper[4921]: I0312 13:10:10.984238 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:10 crc kubenswrapper[4921]: I0312 13:10:10.984258 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:10 crc kubenswrapper[4921]: I0312 13:10:10.985323 4921 scope.go:117] "RemoveContainer" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" Mar 12 13:10:11 crc kubenswrapper[4921]: I0312 13:10:11.136719 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:11 crc kubenswrapper[4921]: I0312 13:10:11.137050 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 12 13:10:11 crc kubenswrapper[4921]: I0312 13:10:11.138618 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:11 crc kubenswrapper[4921]: I0312 13:10:11.138679 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:11 crc kubenswrapper[4921]: I0312 13:10:11.138695 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:11 crc kubenswrapper[4921]: I0312 13:10:11.917114 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:11Z is after 2026-02-23T05:33:13Z Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.171181 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.172285 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.174921 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" exitCode=255 Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.175011 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec"} Mar 12 
13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.175077 4921 scope.go:117] "RemoveContainer" containerID="0884259a04a812a0f29acd6506eaa5adf7ab21183e8009116d00fa45e28b43a7" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.175251 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.176413 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.176464 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.176475 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.176966 4921 scope.go:117] "RemoveContainer" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" Mar 12 13:10:12 crc kubenswrapper[4921]: E0312 13:10:12.177248 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:12 crc kubenswrapper[4921]: E0312 13:10:12.340187 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c1a08b1027d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,LastTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:12 crc kubenswrapper[4921]: E0312 13:10:12.737263 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:12Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.740514 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.742229 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.742312 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.742337 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.742379 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:12 crc kubenswrapper[4921]: E0312 13:10:12.747424 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2026-03-12T13:10:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 13:10:12 crc kubenswrapper[4921]: I0312 13:10:12.919766 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:12Z is after 2026-02-23T05:33:13Z Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.181791 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.840104 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.840431 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.842253 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.842310 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.842319 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:13 crc kubenswrapper[4921]: I0312 13:10:13.917998 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:13Z is after 
2026-02-23T05:33:13Z Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.137517 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.137630 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.920940 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:14Z is after 2026-02-23T05:33:13Z Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.951749 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.952119 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.954136 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.954202 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 
13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.954224 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:14 crc kubenswrapper[4921]: I0312 13:10:14.955391 4921 scope.go:117] "RemoveContainer" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" Mar 12 13:10:14 crc kubenswrapper[4921]: E0312 13:10:14.955710 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:15 crc kubenswrapper[4921]: W0312 13:10:15.323247 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:15Z is after 2026-02-23T05:33:13Z Mar 12 13:10:15 crc kubenswrapper[4921]: E0312 13:10:15.323361 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:15 crc kubenswrapper[4921]: I0312 13:10:15.920185 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-12T13:10:15Z is after 2026-02-23T05:33:13Z Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.304299 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.304524 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.306327 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.306396 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.306423 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.307359 4921 scope.go:117] "RemoveContainer" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" Mar 12 13:10:16 crc kubenswrapper[4921]: E0312 13:10:16.308232 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:16 crc kubenswrapper[4921]: I0312 13:10:16.920427 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:16Z is after 
2026-02-23T05:33:13Z Mar 12 13:10:16 crc kubenswrapper[4921]: W0312 13:10:16.952552 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:16Z is after 2026-02-23T05:33:13Z Mar 12 13:10:16 crc kubenswrapper[4921]: E0312 13:10:16.952698 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:17 crc kubenswrapper[4921]: I0312 13:10:17.919692 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:17Z is after 2026-02-23T05:33:13Z Mar 12 13:10:18 crc kubenswrapper[4921]: E0312 13:10:18.066802 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:10:18 crc kubenswrapper[4921]: I0312 13:10:18.204099 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 13:10:18 crc kubenswrapper[4921]: E0312 13:10:18.210079 4921 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:18 crc kubenswrapper[4921]: E0312 13:10:18.211336 4921 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 12 13:10:18 crc kubenswrapper[4921]: I0312 13:10:18.922001 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:18Z is after 2026-02-23T05:33:13Z Mar 12 13:10:19 crc kubenswrapper[4921]: E0312 13:10:19.743370 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:19Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 13:10:19 crc kubenswrapper[4921]: I0312 13:10:19.748486 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:19 crc kubenswrapper[4921]: I0312 13:10:19.750203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:19 crc kubenswrapper[4921]: I0312 13:10:19.750267 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:19 crc kubenswrapper[4921]: I0312 13:10:19.750286 4921 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:19 crc kubenswrapper[4921]: I0312 13:10:19.750318 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:19 crc kubenswrapper[4921]: E0312 13:10:19.755379 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:19Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 13:10:19 crc kubenswrapper[4921]: I0312 13:10:19.920022 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:19Z is after 2026-02-23T05:33:13Z Mar 12 13:10:20 crc kubenswrapper[4921]: I0312 13:10:20.920463 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:20Z is after 2026-02-23T05:33:13Z Mar 12 13:10:21 crc kubenswrapper[4921]: I0312 13:10:21.919295 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:21Z is after 2026-02-23T05:33:13Z Mar 12 13:10:22 crc kubenswrapper[4921]: E0312 13:10:22.346971 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-12T13:10:22Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c1a08b1027d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,LastTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:22 crc kubenswrapper[4921]: I0312 13:10:22.919289 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:22Z is after 2026-02-23T05:33:13Z Mar 12 13:10:23 crc kubenswrapper[4921]: I0312 13:10:23.919518 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:23Z is after 2026-02-23T05:33:13Z Mar 12 13:10:24 crc kubenswrapper[4921]: I0312 13:10:24.137146 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:10:24 crc kubenswrapper[4921]: I0312 13:10:24.137313 4921 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:10:24 crc kubenswrapper[4921]: I0312 13:10:24.919394 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:24Z is after 2026-02-23T05:33:13Z Mar 12 13:10:24 crc kubenswrapper[4921]: W0312 13:10:24.965163 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:24Z is after 2026-02-23T05:33:13Z Mar 12 13:10:24 crc kubenswrapper[4921]: E0312 13:10:24.965270 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:25 crc kubenswrapper[4921]: I0312 13:10:25.920080 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-12T13:10:25Z is after 2026-02-23T05:33:13Z Mar 12 13:10:26 crc kubenswrapper[4921]: E0312 13:10:26.749292 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:26Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 12 13:10:26 crc kubenswrapper[4921]: I0312 13:10:26.756447 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:26 crc kubenswrapper[4921]: I0312 13:10:26.758416 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:26 crc kubenswrapper[4921]: I0312 13:10:26.758498 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:26 crc kubenswrapper[4921]: I0312 13:10:26.758523 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:26 crc kubenswrapper[4921]: I0312 13:10:26.758563 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:26 crc kubenswrapper[4921]: E0312 13:10:26.763733 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:26Z is after 2026-02-23T05:33:13Z" node="crc" Mar 12 13:10:26 crc kubenswrapper[4921]: I0312 13:10:26.919250 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-12T13:10:26Z is after 2026-02-23T05:33:13Z Mar 12 13:10:27 crc kubenswrapper[4921]: I0312 13:10:27.384973 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 12 13:10:27 crc kubenswrapper[4921]: I0312 13:10:27.385201 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:27 crc kubenswrapper[4921]: I0312 13:10:27.386804 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:27 crc kubenswrapper[4921]: I0312 13:10:27.386898 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:27 crc kubenswrapper[4921]: I0312 13:10:27.386922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:27 crc kubenswrapper[4921]: W0312 13:10:27.737247 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:27Z is after 2026-02-23T05:33:13Z Mar 12 13:10:27 crc kubenswrapper[4921]: E0312 13:10:27.737369 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 12 13:10:27 crc kubenswrapper[4921]: I0312 13:10:27.919609 4921 csi_plugin.go:884] Failed to contact API server 
when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:27Z is after 2026-02-23T05:33:13Z Mar 12 13:10:28 crc kubenswrapper[4921]: E0312 13:10:28.067368 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:10:28 crc kubenswrapper[4921]: I0312 13:10:28.918230 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:28Z is after 2026-02-23T05:33:13Z Mar 12 13:10:29 crc kubenswrapper[4921]: I0312 13:10:29.917462 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:29Z is after 2026-02-23T05:33:13Z Mar 12 13:10:30 crc kubenswrapper[4921]: I0312 13:10:30.920106 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:10:30Z is after 2026-02-23T05:33:13Z Mar 12 13:10:30 crc kubenswrapper[4921]: I0312 13:10:30.982659 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:30 crc kubenswrapper[4921]: I0312 13:10:30.984470 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 12 13:10:30 crc kubenswrapper[4921]: I0312 13:10:30.984533 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:30 crc kubenswrapper[4921]: I0312 13:10:30.984553 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:30 crc kubenswrapper[4921]: I0312 13:10:30.985470 4921 scope.go:117] "RemoveContainer" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" Mar 12 13:10:30 crc kubenswrapper[4921]: E0312 13:10:30.985766 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:31 crc kubenswrapper[4921]: I0312 13:10:31.922022 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.355152 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b1027d12 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.908923666 +0000 UTC m=+0.598995667,LastTimestamp:2026-03-12 
13:09:37.908923666 +0000 UTC m=+0.598995667,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.363579 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.370698 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc 
kubenswrapper[4921]: E0312 13:10:32.377122 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.383915 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b9f79d79 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:38.059206009 +0000 UTC m=+0.749277990,LastTimestamp:2026-03-12 13:09:38.059206009 +0000 UTC m=+0.749277990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.391411 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.085746571 +0000 UTC m=+0.775818563,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.398472 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.085782542 +0000 UTC m=+0.775854523,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.405287 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44ea322\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:38.085801437 +0000 UTC m=+0.775873418,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.412204 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.087028529 +0000 UTC m=+0.777100530,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.420781 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.087065049 +0000 UTC m=+0.777137060,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.424168 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44ea322\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:38.087083434 +0000 UTC m=+0.777155445,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.430935 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC 
m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.087364852 +0000 UTC m=+0.777436813,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.437610 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.087377696 +0000 UTC m=+0.777449667,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.445074 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44ea322\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:38.087386948 +0000 UTC m=+0.777458919,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.452788 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.088736273 +0000 UTC m=+0.778808254,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.459079 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.088804322 +0000 UTC m=+0.778876323,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.464521 4921 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.088851946 +0000 UTC m=+0.778923917,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.471026 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44ea322\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:38.088866169 +0000 UTC m=+0.778938140,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.478104 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.088908231 +0000 UTC m=+0.778980282,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.484766 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44ea322\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:38.088954874 +0000 UTC m=+0.779026875,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.489127 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.090357645 +0000 UTC m=+0.780429636,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.495498 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.090387493 +0000 UTC m=+0.780459474,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.503531 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44ea322\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44ea322 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964245794 +0000 UTC m=+0.654317765,LastTimestamp:2026-03-12 13:09:38.090406979 +0000 UTC m=+0.780478960,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.511087 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e38ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e38ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.96421857 +0000 UTC m=+0.654290551,LastTimestamp:2026-03-12 13:09:38.090833727 +0000 UTC m=+0.780905698,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.516731 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c1a08b44e7adc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c1a08b44e7adc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:37.964235484 +0000 UTC 
m=+0.654307455,LastTimestamp:2026-03-12 13:09:38.090850642 +0000 UTC m=+0.780922613,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.525469 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a08d374c722 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:38.486839074 +0000 UTC m=+1.176911035,LastTimestamp:2026-03-12 13:09:38.486839074 +0000 UTC m=+1.176911035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.529465 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1a08d374f728 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:38.486851368 +0000 UTC m=+1.176923339,LastTimestamp:2026-03-12 13:09:38.486851368 +0000 UTC m=+1.176923339,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.531511 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a08d3bfce3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:38.491756092 +0000 UTC m=+1.181828063,LastTimestamp:2026-03-12 13:09:38.491756092 +0000 UTC m=+1.181828063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.535705 4921 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a08d4bf781f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:38.508511263 +0000 UTC m=+1.198583234,LastTimestamp:2026-03-12 13:09:38.508511263 +0000 UTC m=+1.198583234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.537918 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a08d53823e1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:38.516419553 +0000 UTC m=+1.206491524,LastTimestamp:2026-03-12 13:09:38.516419553 +0000 UTC m=+1.206491524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.544069 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a08facb20bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.146809531 +0000 UTC m=+1.836881502,LastTimestamp:2026-03-12 13:09:39.146809531 +0000 UTC m=+1.836881502,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.549002 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a08fad32d9b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.147337115 +0000 UTC m=+1.837409096,LastTimestamp:2026-03-12 13:09:39.147337115 +0000 UTC m=+1.837409096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.556429 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1a08fad50965 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.147458917 +0000 UTC m=+1.837530889,LastTimestamp:2026-03-12 13:09:39.147458917 +0000 UTC m=+1.837530889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.561563 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a08fad76a54 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.147614804 +0000 UTC m=+1.837686785,LastTimestamp:2026-03-12 13:09:39.147614804 +0000 UTC m=+1.837686785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.573841 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a08fad9a142 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.147759938 +0000 UTC m=+1.837831929,LastTimestamp:2026-03-12 13:09:39.147759938 +0000 UTC m=+1.837831929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.584855 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a08fb5bdad5 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.156294357 +0000 UTC m=+1.846366328,LastTimestamp:2026-03-12 13:09:39.156294357 +0000 UTC m=+1.846366328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.592805 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a08fb75e8a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.158001824 +0000 UTC m=+1.848073795,LastTimestamp:2026-03-12 13:09:39.158001824 +0000 UTC m=+1.848073795,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.603413 4921 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a08fb84d7ed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.158980589 +0000 UTC m=+1.849052550,LastTimestamp:2026-03-12 13:09:39.158980589 +0000 UTC m=+1.849052550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.608771 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a08fb9c0f0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.160502028 +0000 UTC m=+1.850573999,LastTimestamp:2026-03-12 13:09:39.160502028 +0000 UTC m=+1.850573999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.614034 4921 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a08fba893fc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.161322492 +0000 UTC m=+1.851394463,LastTimestamp:2026-03-12 13:09:39.161322492 +0000 UTC m=+1.851394463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.621244 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1a08fbab7a57 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.161512535 +0000 UTC m=+1.851584516,LastTimestamp:2026-03-12 13:09:39.161512535 +0000 UTC m=+1.851584516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc 
kubenswrapper[4921]: E0312 13:10:32.625931 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a090f98ff34 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.495845684 +0000 UTC m=+2.185917695,LastTimestamp:2026-03-12 13:09:39.495845684 +0000 UTC m=+2.185917695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.630521 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0910656b98 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.509242776 +0000 UTC m=+2.199314777,LastTimestamp:2026-03-12 13:09:39.509242776 +0000 UTC 
m=+2.199314777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.632200 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a09108346f4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.511199476 +0000 UTC m=+2.201271447,LastTimestamp:2026-03-12 13:09:39.511199476 +0000 UTC m=+2.201271447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.635842 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a091d55a3d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.726312407 +0000 UTC m=+2.416384388,LastTimestamp:2026-03-12 13:09:39.726312407 +0000 UTC m=+2.416384388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.640180 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a091de91b45 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.735976773 +0000 UTC m=+2.426048744,LastTimestamp:2026-03-12 13:09:39.735976773 +0000 UTC m=+2.426048744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.644671 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a091dfc0834 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.737217076 +0000 UTC m=+2.427289057,LastTimestamp:2026-03-12 13:09:39.737217076 +0000 UTC m=+2.427289057,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.648576 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0927f1e3d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.904324566 +0000 UTC m=+2.594396557,LastTimestamp:2026-03-12 13:09:39.904324566 +0000 UTC 
m=+2.594396557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.652228 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0928924f02 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.914837762 +0000 UTC m=+2.604909733,LastTimestamp:2026-03-12 13:09:39.914837762 +0000 UTC m=+2.604909733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.656174 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1a092e174ce8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.007439592 +0000 UTC m=+2.697511553,LastTimestamp:2026-03-12 13:09:40.007439592 +0000 UTC m=+2.697511553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.659687 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a092e5cd6db openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.011996891 +0000 UTC m=+2.702068892,LastTimestamp:2026-03-12 13:09:40.011996891 +0000 UTC m=+2.702068892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.663856 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a092ec34f92 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.018712466 +0000 UTC m=+2.708784437,LastTimestamp:2026-03-12 13:09:40.018712466 +0000 UTC m=+2.708784437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.668074 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a092f69c41c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.029621276 +0000 UTC m=+2.719693287,LastTimestamp:2026-03-12 13:09:40.029621276 +0000 UTC m=+2.719693287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.674298 4921 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a093c21bf7d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.243005309 +0000 UTC m=+2.933077270,LastTimestamp:2026-03-12 13:09:40.243005309 +0000 UTC m=+2.933077270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.677968 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a093c3620e5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.244340965 +0000 UTC m=+2.934412936,LastTimestamp:2026-03-12 13:09:40.244340965 +0000 UTC m=+2.934412936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.681390 4921 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a093c37e4fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.244456699 +0000 UTC m=+2.934528670,LastTimestamp:2026-03-12 13:09:40.244456699 +0000 UTC m=+2.934528670,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.685615 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1a093c682644 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.24761914 +0000 UTC m=+2.937691111,LastTimestamp:2026-03-12 13:09:40.24761914 +0000 UTC m=+2.937691111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.689297 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a093d0e440c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.25850574 +0000 UTC m=+2.948577711,LastTimestamp:2026-03-12 13:09:40.25850574 +0000 UTC m=+2.948577711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.693349 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a093d1f8509 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.259636489 +0000 UTC m=+2.949708460,LastTimestamp:2026-03-12 13:09:40.259636489 +0000 UTC m=+2.949708460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.697714 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a093d3cba90 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.261550736 +0000 UTC m=+2.951622707,LastTimestamp:2026-03-12 13:09:40.261550736 +0000 UTC m=+2.951622707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.701485 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a093d48b8de openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.262336734 +0000 UTC m=+2.952408705,LastTimestamp:2026-03-12 13:09:40.262336734 +0000 UTC m=+2.952408705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.706320 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c1a093d9653f7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.267422711 +0000 UTC m=+2.957494682,LastTimestamp:2026-03-12 13:09:40.267422711 +0000 UTC m=+2.957494682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.710579 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a093e0d44ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.275217582 +0000 UTC m=+2.965289553,LastTimestamp:2026-03-12 13:09:40.275217582 +0000 UTC m=+2.965289553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.716102 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0948247a08 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.444510728 +0000 UTC m=+3.134582699,LastTimestamp:2026-03-12 13:09:40.444510728 +0000 UTC m=+3.134582699,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.721071 4921 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0949280fa3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.461522851 +0000 UTC m=+3.151594822,LastTimestamp:2026-03-12 13:09:40.461522851 +0000 UTC m=+3.151594822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.725605 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a094937ad83 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.462546307 +0000 UTC m=+3.152618278,LastTimestamp:2026-03-12 13:09:40.462546307 +0000 UTC m=+3.152618278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.730309 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a09493ed1db openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.463014363 +0000 UTC m=+3.153086334,LastTimestamp:2026-03-12 13:09:40.463014363 +0000 UTC m=+3.153086334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.734295 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a094a860e13 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.484460051 +0000 UTC 
m=+3.174532032,LastTimestamp:2026-03-12 13:09:40.484460051 +0000 UTC m=+3.174532032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.736880 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a094a95422a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.485456426 +0000 UTC m=+3.175528397,LastTimestamp:2026-03-12 13:09:40.485456426 +0000 UTC m=+3.175528397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.739410 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a09549a0678 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.653540984 +0000 UTC m=+3.343612955,LastTimestamp:2026-03-12 13:09:40.653540984 +0000 UTC m=+3.343612955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.741972 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a0954aada95 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.654643861 +0000 UTC m=+3.344715842,LastTimestamp:2026-03-12 13:09:40.654643861 +0000 UTC m=+3.344715842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.743297 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0955716ae0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.667656928 +0000 UTC m=+3.357728899,LastTimestamp:2026-03-12 13:09:40.667656928 +0000 UTC m=+3.357728899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.746944 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a095583805d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.668842077 +0000 UTC m=+3.358914048,LastTimestamp:2026-03-12 13:09:40.668842077 +0000 UTC m=+3.358914048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.748919 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c1a0955a0eb0b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.670769931 +0000 UTC m=+3.360841902,LastTimestamp:2026-03-12 13:09:40.670769931 +0000 UTC m=+3.360841902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.752136 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a095e8c63b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.820419509 +0000 UTC 
m=+3.510491480,LastTimestamp:2026-03-12 13:09:40.820419509 +0000 UTC m=+3.510491480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.753401 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a095f0a49ca openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.82867041 +0000 UTC m=+3.518742371,LastTimestamp:2026-03-12 13:09:40.82867041 +0000 UTC m=+3.518742371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.756059 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a095f1c3c3e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.82984659 +0000 UTC m=+3.519918561,LastTimestamp:2026-03-12 13:09:40.82984659 +0000 UTC m=+3.519918561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.758381 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a096892fcce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.988624078 +0000 UTC m=+3.678696049,LastTimestamp:2026-03-12 13:09:40.988624078 +0000 UTC m=+3.678696049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.759699 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0969541b1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:41.001280285 +0000 UTC m=+3.691352256,LastTimestamp:2026-03-12 13:09:41.001280285 +0000 UTC m=+3.691352256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.763188 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a096b2e2b96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:41.032348566 +0000 UTC m=+3.722420537,LastTimestamp:2026-03-12 13:09:41.032348566 +0000 UTC m=+3.722420537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.765609 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a0975eecf4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:41.212745546 +0000 UTC m=+3.902817507,LastTimestamp:2026-03-12 13:09:41.212745546 +0000 UTC m=+3.902817507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.772133 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a0976f86948 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:41.230152008 +0000 UTC m=+3.920223969,LastTimestamp:2026-03-12 13:09:41.230152008 +0000 UTC m=+3.920223969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.778406 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1a095f1c3c3e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a095f1c3c3e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.82984659 +0000 UTC m=+3.519918561,LastTimestamp:2026-03-12 13:09:42.043443872 +0000 UTC m=+4.733515853,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.785160 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09a7b12a02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.047566338 +0000 UTC m=+4.737638359,LastTimestamp:2026-03-12 13:09:42.047566338 +0000 UTC m=+4.737638359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.790894 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1a096892fcce\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a096892fcce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:40.988624078 +0000 UTC m=+3.678696049,LastTimestamp:2026-03-12 13:09:42.278960074 +0000 UTC m=+4.969032085,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.796840 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09b5826088 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.279381128 +0000 UTC m=+4.969453129,LastTimestamp:2026-03-12 13:09:42.279381128 +0000 UTC 
m=+4.969453129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.802539 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c1a0969541b1d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0969541b1d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:41.001280285 +0000 UTC m=+3.691352256,LastTimestamp:2026-03-12 13:09:42.290857295 +0000 UTC m=+4.980929306,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.807137 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09b6758af9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.295317241 +0000 UTC 
m=+4.985389212,LastTimestamp:2026-03-12 13:09:42.295317241 +0000 UTC m=+4.985389212,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.811170 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09b689d67f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.296647295 +0000 UTC m=+4.986719276,LastTimestamp:2026-03-12 13:09:42.296647295 +0000 UTC m=+4.986719276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.816528 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09c2b08cf7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container 
etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.500510967 +0000 UTC m=+5.190582938,LastTimestamp:2026-03-12 13:09:42.500510967 +0000 UTC m=+5.190582938,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.821116 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09c3a7d6b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.516717239 +0000 UTC m=+5.206789210,LastTimestamp:2026-03-12 13:09:42.516717239 +0000 UTC m=+5.206789210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.825618 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09c3bd29b0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.518114736 +0000 UTC m=+5.208186707,LastTimestamp:2026-03-12 13:09:42.518114736 +0000 UTC m=+5.208186707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.829483 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09d151bcce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.745955534 +0000 UTC m=+5.436027505,LastTimestamp:2026-03-12 13:09:42.745955534 +0000 UTC m=+5.436027505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.833415 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09d20e0379 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.758294393 +0000 UTC m=+5.448366374,LastTimestamp:2026-03-12 13:09:42.758294393 +0000 UTC m=+5.448366374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.836889 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09d2207a38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.75950444 +0000 UTC m=+5.449576421,LastTimestamp:2026-03-12 13:09:42.75950444 +0000 UTC m=+5.449576421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.841465 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09df7d84eb 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.983705835 +0000 UTC m=+5.673777816,LastTimestamp:2026-03-12 13:09:42.983705835 +0000 UTC m=+5.673777816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.845265 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09e059fb2b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.998154027 +0000 UTC m=+5.688226008,LastTimestamp:2026-03-12 13:09:42.998154027 +0000 UTC m=+5.688226008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.849178 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09e06bb6c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:42.999316163 +0000 UTC m=+5.689388144,LastTimestamp:2026-03-12 13:09:42.999316163 +0000 UTC m=+5.689388144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.853732 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09ed192dfe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:43.212011006 +0000 UTC m=+5.902082997,LastTimestamp:2026-03-12 13:09:43.212011006 +0000 UTC m=+5.902082997,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.857222 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c1a09ee1d781b openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:43.229069339 +0000 UTC m=+5.919141350,LastTimestamp:2026-03-12 13:09:43.229069339 +0000 UTC m=+5.919141350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.861501 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1a0a242e8de5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 12 13:10:32 crc kubenswrapper[4921]: body: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:44.136158693 +0000 UTC m=+6.826230674,LastTimestamp:2026-03-12 13:09:44.136158693 +0000 UTC m=+6.826230674,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:32 crc 
kubenswrapper[4921]: E0312 13:10:32.865498 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0a242fe3dc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:44.136246236 +0000 UTC m=+6.826318227,LastTimestamp:2026-03-12 13:09:44.136246236 +0000 UTC m=+6.826318227,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.870190 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-apiserver-crc.189c1a0bef7de1d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers) Mar 12 13:10:32 crc kubenswrapper[4921]: body: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:51.842099667 +0000 UTC m=+14.532171678,LastTimestamp:2026-03-12 13:09:51.842099667 +0000 UTC m=+14.532171678,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.874210 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0bef7f4bde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:51.84219235 +0000 UTC m=+14.532264351,LastTimestamp:2026-03-12 13:09:51.84219235 +0000 UTC m=+14.532264351,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.877920 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: 
&Event{ObjectMeta:{kube-apiserver-crc.189c1a0c0d2800a3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 12 13:10:32 crc kubenswrapper[4921]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 12 13:10:32 crc kubenswrapper[4921]: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:52.339787939 +0000 UTC m=+15.029859940,LastTimestamp:2026-03-12 13:09:52.339787939 +0000 UTC m=+15.029859940,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.882033 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c1a0c0d2a1e73 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:52.339926643 +0000 UTC m=+15.029998624,LastTimestamp:2026-03-12 
13:09:52.339926643 +0000 UTC m=+15.029998624,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.886916 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783c8a96 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 12 13:10:32 crc kubenswrapper[4921]: body: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136296086 +0000 UTC m=+16.826368057,LastTimestamp:2026-03-12 13:09:54.136296086 +0000 UTC m=+16.826368057,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.890977 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783d8498 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136360088 +0000 UTC m=+16.826432059,LastTimestamp:2026-03-12 13:09:54.136360088 +0000 UTC m=+16.826432059,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.897680 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a0c783c8a96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783c8a96 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 12 13:10:32 crc kubenswrapper[4921]: body: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136296086 +0000 UTC m=+16.826368057,LastTimestamp:2026-03-12 
13:10:04.137667975 +0000 UTC m=+26.827739976,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.903538 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a0c783d8498\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783d8498 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136360088 +0000 UTC m=+16.826432059,LastTimestamp:2026-03-12 13:10:04.137809799 +0000 UTC m=+26.827881800,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.908323 4921 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0ecc7efe45 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:10:04.139871813 +0000 UTC m=+26.829943794,LastTimestamp:2026-03-12 13:10:04.139871813 +0000 UTC m=+26.829943794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.913274 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a08fb75e8a0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a08fb75e8a0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.158001824 +0000 UTC m=+1.848073795,LastTimestamp:2026-03-12 13:10:04.257461229 +0000 UTC m=+26.947533200,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.917265 4921 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a090f98ff34\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a090f98ff34 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.495845684 +0000 UTC m=+2.185917695,LastTimestamp:2026-03-12 13:10:04.469047376 +0000 UTC m=+27.159119347,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: I0312 13:10:32.917327 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.918801 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a0910656b98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0910656b98 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:39.509242776 +0000 UTC m=+2.199314777,LastTimestamp:2026-03-12 13:10:04.478575109 +0000 UTC m=+27.168647100,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.923166 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a0c783c8a96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783c8a96 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 12 13:10:32 crc kubenswrapper[4921]: body: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136296086 +0000 UTC m=+16.826368057,LastTimestamp:2026-03-12 13:10:14.137602648 +0000 UTC m=+36.827674639,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.925482 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a0c783d8498\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783d8498 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136360088 +0000 UTC m=+16.826432059,LastTimestamp:2026-03-12 13:10:14.13766541 +0000 UTC m=+36.827737381,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:10:32 crc kubenswrapper[4921]: E0312 13:10:32.928163 4921 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c1a0c783c8a96\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 12 13:10:32 crc kubenswrapper[4921]: &Event{ObjectMeta:{kube-controller-manager-crc.189c1a0c783c8a96 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 12 13:10:32 crc kubenswrapper[4921]: body: Mar 12 13:10:32 crc kubenswrapper[4921]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:09:54.136296086 +0000 UTC m=+16.826368057,LastTimestamp:2026-03-12 13:10:24.137260136 +0000 UTC m=+46.827332157,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 12 13:10:32 crc kubenswrapper[4921]: > Mar 12 13:10:33 crc kubenswrapper[4921]: E0312 13:10:33.757424 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 13:10:33 crc kubenswrapper[4921]: I0312 13:10:33.764486 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:33 crc kubenswrapper[4921]: I0312 13:10:33.766482 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:33 crc kubenswrapper[4921]: I0312 13:10:33.766544 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:33 crc kubenswrapper[4921]: I0312 13:10:33.766562 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:33 crc kubenswrapper[4921]: I0312 13:10:33.766599 4921 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:33 crc kubenswrapper[4921]: E0312 13:10:33.773363 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 13:10:33 crc kubenswrapper[4921]: I0312 13:10:33.919205 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.137750 4921 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.137879 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.137985 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.138172 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.139682 4921 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.139731 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.139756 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.140525 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ef85ddea9ad217de7e7535c7f3402ce7b136a8b3779025e0f48a59124968e814"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.140683 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ef85ddea9ad217de7e7535c7f3402ce7b136a8b3779025e0f48a59124968e814" gracePeriod=30 Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.322237 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.324425 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.325187 4921 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ef85ddea9ad217de7e7535c7f3402ce7b136a8b3779025e0f48a59124968e814" exitCode=255 
Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.325239 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ef85ddea9ad217de7e7535c7f3402ce7b136a8b3779025e0f48a59124968e814"} Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.325353 4921 scope.go:117] "RemoveContainer" containerID="71e16d390453f815fe97adc237523571ff4da158f8bde8bb89cbf2b411fc7be9" Mar 12 13:10:34 crc kubenswrapper[4921]: I0312 13:10:34.922766 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.332257 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.334775 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"99630fb5afd3ee96ce0624f0b97eab2bc4d5c61b3ebaf9f14bda2bc248446642"} Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.334946 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.336563 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.336624 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.336642 4921 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:35 crc kubenswrapper[4921]: I0312 13:10:35.920119 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:36 crc kubenswrapper[4921]: I0312 13:10:36.337421 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:36 crc kubenswrapper[4921]: I0312 13:10:36.338764 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:36 crc kubenswrapper[4921]: I0312 13:10:36.338846 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:36 crc kubenswrapper[4921]: I0312 13:10:36.339049 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:36 crc kubenswrapper[4921]: I0312 13:10:36.922626 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:37 crc kubenswrapper[4921]: I0312 13:10:37.919930 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:38 crc kubenswrapper[4921]: E0312 13:10:38.067722 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:10:38 crc kubenswrapper[4921]: I0312 13:10:38.919443 4921 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:39 crc kubenswrapper[4921]: I0312 13:10:39.921681 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:40 crc kubenswrapper[4921]: E0312 13:10:40.764439 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 13:10:40 crc kubenswrapper[4921]: I0312 13:10:40.773897 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:40 crc kubenswrapper[4921]: I0312 13:10:40.776437 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:40 crc kubenswrapper[4921]: I0312 13:10:40.776493 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:40 crc kubenswrapper[4921]: I0312 13:10:40.776505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:40 crc kubenswrapper[4921]: I0312 13:10:40.776545 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:40 crc kubenswrapper[4921]: E0312 13:10:40.782228 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 13:10:40 crc kubenswrapper[4921]: I0312 13:10:40.920207 4921 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.136311 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.136522 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.137731 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.137792 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.137804 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.140514 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.353865 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.353966 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.354912 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.354963 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.354976 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:41 crc kubenswrapper[4921]: I0312 13:10:41.922505 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:42 crc kubenswrapper[4921]: I0312 13:10:42.356835 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:42 crc kubenswrapper[4921]: I0312 13:10:42.357986 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:42 crc kubenswrapper[4921]: I0312 13:10:42.358031 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:42 crc kubenswrapper[4921]: I0312 13:10:42.358044 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:42 crc kubenswrapper[4921]: I0312 13:10:42.919427 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:43 crc kubenswrapper[4921]: I0312 13:10:43.918409 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:43 crc kubenswrapper[4921]: I0312 13:10:43.983137 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:43 crc 
kubenswrapper[4921]: I0312 13:10:43.984141 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:43 crc kubenswrapper[4921]: I0312 13:10:43.984181 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:43 crc kubenswrapper[4921]: I0312 13:10:43.984196 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:43 crc kubenswrapper[4921]: I0312 13:10:43.984797 4921 scope.go:117] "RemoveContainer" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.364417 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.366525 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6"} Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.366692 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.367628 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.367664 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.367677 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:44 crc kubenswrapper[4921]: I0312 13:10:44.918159 4921 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.371065 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.371921 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.373959 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" exitCode=255 Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.374025 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6"} Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.374083 4921 scope.go:117] "RemoveContainer" containerID="88e1c0d8f62fdd3d054bda0ccb2cff60ec0d55f0480fbaa30ccaccf275cfceec" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.374240 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.375618 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.375668 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:45 crc 
kubenswrapper[4921]: I0312 13:10:45.375680 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.376275 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:10:45 crc kubenswrapper[4921]: E0312 13:10:45.376466 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:45 crc kubenswrapper[4921]: I0312 13:10:45.918979 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.304086 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.379026 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.380523 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.381350 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.381381 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.381391 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.381871 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:10:46 crc kubenswrapper[4921]: E0312 13:10:46.382017 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:46 crc kubenswrapper[4921]: I0312 13:10:46.917934 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:47 crc kubenswrapper[4921]: E0312 13:10:47.770357 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 12 13:10:47 crc kubenswrapper[4921]: I0312 13:10:47.783406 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:47 crc kubenswrapper[4921]: I0312 13:10:47.784617 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:47 crc kubenswrapper[4921]: I0312 13:10:47.784650 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 13:10:47 crc kubenswrapper[4921]: I0312 13:10:47.784658 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:47 crc kubenswrapper[4921]: I0312 13:10:47.784678 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:47 crc kubenswrapper[4921]: E0312 13:10:47.790033 4921 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 12 13:10:47 crc kubenswrapper[4921]: I0312 13:10:47.920742 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:48 crc kubenswrapper[4921]: E0312 13:10:48.067924 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:10:48 crc kubenswrapper[4921]: I0312 13:10:48.921520 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:49 crc kubenswrapper[4921]: W0312 13:10:49.333594 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 12 13:10:49 crc kubenswrapper[4921]: E0312 13:10:49.333656 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the 
cluster scope" logger="UnhandledError" Mar 12 13:10:49 crc kubenswrapper[4921]: I0312 13:10:49.921519 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:50 crc kubenswrapper[4921]: I0312 13:10:50.212947 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 13:10:50 crc kubenswrapper[4921]: I0312 13:10:50.228365 4921 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 12 13:10:50 crc kubenswrapper[4921]: I0312 13:10:50.919347 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:51 crc kubenswrapper[4921]: W0312 13:10:51.460421 4921 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:51 crc kubenswrapper[4921]: E0312 13:10:51.460510 4921 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 12 13:10:51 crc kubenswrapper[4921]: I0312 13:10:51.921233 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 12 13:10:52 crc kubenswrapper[4921]: I0312 13:10:52.920693 4921 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 12 13:10:52 crc kubenswrapper[4921]: I0312 13:10:52.934835 4921 csr.go:261] certificate signing request csr-vv4gl is approved, waiting to be issued Mar 12 13:10:52 crc kubenswrapper[4921]: I0312 13:10:52.947872 4921 csr.go:257] certificate signing request csr-vv4gl is issued Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.023011 4921 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.757311 4921 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.846332 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.846500 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.848105 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.848170 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.848192 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.949764 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation 
deadline is 2026-12-30 15:36:12.762234801 +0000 UTC Mar 12 13:10:53 crc kubenswrapper[4921]: I0312 13:10:53.949890 4921 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7034h25m18.81235303s for next certificate rotation Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.790982 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.792568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.792622 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.792641 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.792777 4921 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.807604 4921 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.807997 4921 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.808027 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.811959 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.812001 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.812012 4921 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.812030 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.812045 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:10:54Z","lastTransitionTime":"2026-03-12T13:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.833221 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.842907 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.842953 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.842968 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.842987 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.843002 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:10:54Z","lastTransitionTime":"2026-03-12T13:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.856942 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.865029 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.865084 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.865107 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.865139 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.865186 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:10:54Z","lastTransitionTime":"2026-03-12T13:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.883992 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.893483 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.893512 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.893520 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.893533 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.893542 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:10:54Z","lastTransitionTime":"2026-03-12T13:10:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.906663 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:10:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.906799 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.906834 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.951921 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.952117 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.953220 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.953258 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.953266 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:10:54 crc kubenswrapper[4921]: I0312 13:10:54.953858 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:10:54 crc kubenswrapper[4921]: E0312 13:10:54.954199 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.007282 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.107602 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.208599 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.309166 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.410118 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.511085 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.612229 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.712904 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.813701 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:55 crc kubenswrapper[4921]: E0312 13:10:55.913876 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.014922 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc 
kubenswrapper[4921]: E0312 13:10:56.115462 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.216471 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.316599 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.416920 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.517614 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.618529 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.719118 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.819572 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:56 crc kubenswrapper[4921]: E0312 13:10:56.920185 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.020374 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.120984 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.221881 4921 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.322038 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.422328 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.523000 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.624027 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.724920 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.826077 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:57 crc kubenswrapper[4921]: E0312 13:10:57.926494 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.027084 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.068508 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.127684 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.228558 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.329524 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.429652 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.530150 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.631922 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.732463 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.833408 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:58 crc kubenswrapper[4921]: E0312 13:10:58.933724 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.034269 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.134702 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.235313 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.336246 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.436955 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc 
kubenswrapper[4921]: E0312 13:10:59.537770 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.638893 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.739697 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.840869 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:10:59 crc kubenswrapper[4921]: E0312 13:10:59.941040 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.041225 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.141624 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.241717 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.341915 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.442696 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.543696 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.644591 4921 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.745407 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.846071 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:00 crc kubenswrapper[4921]: E0312 13:11:00.946205 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.046885 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.147746 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.248573 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.349117 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.449663 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.550124 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.651193 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.751613 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.852231 4921 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:01 crc kubenswrapper[4921]: E0312 13:11:01.953360 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.054171 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.154888 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.255986 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.356902 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.457178 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.557287 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.658200 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.759077 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.859296 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: E0312 13:11:02.959549 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:02 crc kubenswrapper[4921]: I0312 
13:11:02.967900 4921 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.059967 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.160752 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.261141 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.361306 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.461608 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.562399 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.662533 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.762908 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.864032 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:03 crc kubenswrapper[4921]: E0312 13:11:03.965038 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.065217 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 
13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.165522 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.266702 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.367317 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.468309 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.568868 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.669946 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.771048 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.871970 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.924665 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.930136 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.930196 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.930213 4921 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.930238 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.930257 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:04Z","lastTransitionTime":"2026-03-12T13:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.940757 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.946111 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.946166 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.946188 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.946217 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.946235 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:04Z","lastTransitionTime":"2026-03-12T13:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.957898 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.962284 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.962338 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.962358 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.962382 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.962404 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:04Z","lastTransitionTime":"2026-03-12T13:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.978675 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.983497 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.983587 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.983604 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.983632 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:04 crc kubenswrapper[4921]: I0312 13:11:04.983647 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:04Z","lastTransitionTime":"2026-03-12T13:11:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.998149 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.998397 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 13:11:04 crc kubenswrapper[4921]: E0312 13:11:04.998444 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.098945 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.199900 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.300905 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.401346 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.501864 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.602972 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.703140 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.803515 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:05 crc kubenswrapper[4921]: E0312 13:11:05.904020 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.004409 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.105539 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.205659 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.306201 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.406613 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.507733 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.608722 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: I0312 13:11:06.639775 4921 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.708899 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.809762 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:06 crc kubenswrapper[4921]: E0312 13:11:06.910700 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc 
kubenswrapper[4921]: E0312 13:11:07.011182 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.112236 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.213190 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.313453 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.414633 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.516038 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.616580 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.716986 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.817144 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:07 crc kubenswrapper[4921]: E0312 13:11:07.917246 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.017653 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.069454 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" 
err="failed to get node info: node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.118357 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.218611 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.318730 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.418903 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.520093 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.620545 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.721741 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.822185 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:08 crc kubenswrapper[4921]: E0312 13:11:08.923024 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.023804 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.124392 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.224558 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.325423 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.425918 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.526440 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.626892 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.727672 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.828160 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.928889 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.983149 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.983149 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.984511 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.984578 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:09 crc 
kubenswrapper[4921]: I0312 13:11:09.984637 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.984675 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.984695 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.984706 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:09 crc kubenswrapper[4921]: I0312 13:11:09.985769 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:11:09 crc kubenswrapper[4921]: E0312 13:11:09.986065 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.029066 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.130070 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.231014 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.331508 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 
13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.432084 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.532298 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.633373 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.733485 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.834394 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:10 crc kubenswrapper[4921]: E0312 13:11:10.935495 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.036015 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.137097 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.238221 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.338958 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.439040 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.539616 4921 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.639757 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.740867 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.841694 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:11 crc kubenswrapper[4921]: E0312 13:11:11.942632 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.043117 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.143611 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.243874 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.343993 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.444965 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.545072 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.645590 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.746151 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.846475 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: E0312 13:11:12.947379 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:12 crc kubenswrapper[4921]: I0312 13:11:12.983267 4921 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 13:11:12 crc kubenswrapper[4921]: I0312 13:11:12.984616 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:12 crc kubenswrapper[4921]: I0312 13:11:12.984679 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:12 crc kubenswrapper[4921]: I0312 13:11:12.984696 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.047528 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.148002 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.249001 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.349492 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.449699 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc 
kubenswrapper[4921]: E0312 13:11:13.549985 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.650143 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.751237 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.851408 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:13 crc kubenswrapper[4921]: E0312 13:11:13.952054 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.052549 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.153552 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.254574 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.355672 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.456797 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.557938 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.658833 4921 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.758982 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.859306 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:14 crc kubenswrapper[4921]: E0312 13:11:14.960327 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.060699 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.083318 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.088242 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.088284 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.088296 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.088312 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.088322 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:15Z","lastTransitionTime":"2026-03-12T13:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.103863 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.108166 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.108228 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.108249 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.108272 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.108291 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:15Z","lastTransitionTime":"2026-03-12T13:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.123900 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.128116 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.128164 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.128175 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.128191 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.128206 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:15Z","lastTransitionTime":"2026-03-12T13:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.142873 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.147386 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.147523 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.147548 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.147577 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:15 crc kubenswrapper[4921]: I0312 13:11:15.147598 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:15Z","lastTransitionTime":"2026-03-12T13:11:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.160115 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.160266 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.161068 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.262069 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.363133 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.463841 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.564804 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.665737 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.765918 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.866703 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:15 crc kubenswrapper[4921]: E0312 13:11:15.967449 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.067887 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.168441 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.269070 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.370114 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.470658 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.571782 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.672807 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.773068 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.873839 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:16 crc kubenswrapper[4921]: E0312 13:11:16.974156 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.074534 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.174880 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc 
kubenswrapper[4921]: E0312 13:11:17.275917 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.377086 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.477718 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.577917 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.679050 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.779872 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.880435 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:17 crc kubenswrapper[4921]: E0312 13:11:17.981594 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.070691 4921 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.081722 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.182651 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.283639 4921 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.384667 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.485764 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.586585 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.687422 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.788351 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.889352 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:18 crc kubenswrapper[4921]: E0312 13:11:18.989684 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.090491 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.191300 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.292468 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.392802 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.493172 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.594247 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.695219 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.795460 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.896491 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:19 crc kubenswrapper[4921]: E0312 13:11:19.997678 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.098727 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.198902 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.299646 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.400114 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.500998 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.601125 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc 
kubenswrapper[4921]: E0312 13:11:20.702109 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.802581 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:20 crc kubenswrapper[4921]: E0312 13:11:20.903083 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.004073 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.104712 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.205899 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.306965 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.408139 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.508930 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.609845 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.710128 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.810411 4921 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 12 13:11:21 crc kubenswrapper[4921]: E0312 13:11:21.911258 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.011880 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.112070 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.213283 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.313995 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.414659 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.515179 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.615939 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.717053 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.818006 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.918420 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:22 crc kubenswrapper[4921]: I0312 13:11:22.983286 4921 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 12 13:11:22 crc kubenswrapper[4921]: I0312 13:11:22.984426 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:22 crc kubenswrapper[4921]: I0312 13:11:22.984505 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:22 crc kubenswrapper[4921]: I0312 13:11:22.984540 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:22 crc kubenswrapper[4921]: I0312 13:11:22.985654 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:11:22 crc kubenswrapper[4921]: E0312 13:11:22.986044 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.019617 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.120710 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.221542 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.322723 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.423660 4921 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.524665 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.625433 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.726297 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.826679 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:23 crc kubenswrapper[4921]: E0312 13:11:23.927123 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.027974 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.128439 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.229534 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.329860 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.430893 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.531911 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc 
kubenswrapper[4921]: E0312 13:11:24.632518 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.733418 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.833624 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:24 crc kubenswrapper[4921]: E0312 13:11:24.934501 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.035256 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.135625 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.236084 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.336256 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.436368 4921 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.484319 4921 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.504875 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.505003 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.505019 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.505093 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.505113 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.514641 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.518871 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.518927 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.518947 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.518973 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.518991 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.535728 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.540649 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.540726 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.540747 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.540773 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.540793 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.556572 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.561543 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.561598 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.561608 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.561628 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.561640 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.581371 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.587184 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.587259 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.587270 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.587289 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.587301 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.597715 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"cb0bf9b7-9747-40d7-a967-f44b0632d26d\\\",\\\"systemUUID\\\":\\\"2fb4ffa9-fae0-4002-98df-640245dc5e65\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.597986 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.599973 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.600043 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.600068 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.600102 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.600128 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.703261 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.703322 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.703340 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.703367 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.703390 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.806341 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.806409 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.806427 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.806453 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.806472 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.909932 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.909994 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.910013 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.910037 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.910055 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:25Z","lastTransitionTime":"2026-03-12T13:11:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.953705 4921 apiserver.go:52] "Watching apiserver" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.962493 4921 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.963114 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5swj9","openshift-image-registry/node-ca-rg7j6","openshift-machine-config-operator/machine-config-daemon-fkpqq","openshift-multus/multus-additional-cni-plugins-nwgrv","openshift-network-operator/iptables-alerter-4ln5h","openshift-multus/multus-q6tv6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-rl674"] Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.963564 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.963710 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.963859 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.963893 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.963942 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.964055 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.964101 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.964166 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 13:11:25 crc kubenswrapper[4921]: E0312 13:11:25.964192 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.964419 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.964764 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.964938 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.965082 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.965270 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.965546 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q6tv6" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.967501 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.969698 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.969913 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.970407 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.972111 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.972933 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.972948 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.973249 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.973576 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.973671 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.975652 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.975766 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.977789 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.978992 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.979577 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.979675 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.979975 4921 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.980219 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.980290 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.980480 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.980792 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981013 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981076 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981115 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981288 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981346 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981369 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 
13:11:25.981369 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981451 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981525 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981575 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981588 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981596 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.981800 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.982077 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 13:11:25 crc kubenswrapper[4921]: I0312 13:11:25.988704 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.011743 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6tv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db00f274-e86e-48c1-b0fe-5b4750265b85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkbxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6tv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.013020 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.013082 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.013093 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.013110 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.013123 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.018679 4921 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.028735 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.041665 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg7j6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e54723c-5e07-4a87-9410-92b770b28714\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg7j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.058970 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.075095 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae82cb49-657a-4b47-8107-0729b9edf47b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fkpqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077496 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077567 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077681 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077720 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077753 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077787 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.077983 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078015 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078165 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078210 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078241 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078272 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078305 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078308 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: 
"49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078336 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078488 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078546 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078603 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " 
Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078661 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078708 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078718 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078793 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078842 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078853 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078866 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078934 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079121 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079165 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079197 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" 
(UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079230 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079263 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079296 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079329 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079361 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079391 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079454 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079484 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079517 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079555 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079591 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079621 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079654 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079685 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079715 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079748 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079873 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079907 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079941 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079971 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080004 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080038 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080071 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080101 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080132 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080160 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080191 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080221 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080252 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080318 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080347 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.080380 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080412 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080446 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080479 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080519 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080552 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080586 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080620 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080656 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080687 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080721 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 
13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080758 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080789 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080870 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080906 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.078922 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079204 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079470 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.079495 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.080940 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082256 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082290 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082650 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082696 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082775 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082834 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.082876 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.083524 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.083559 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.083605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.083645 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.083920 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084017 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084955 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084296 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084510 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084590 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084685 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084733 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084762 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084767 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084782 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084971 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085476 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085706 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085780 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085804 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085845 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085867 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085877 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085892 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.085972 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086008 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086015 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086037 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086070 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086358 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086104 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086581 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086608 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086632 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086656 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086679 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086702 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086725 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086759 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086780 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.086997 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087022 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087030 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087046 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087072 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087095 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.087118 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087141 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087164 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087185 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087230 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087253 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087278 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087383 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087399 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087491 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087516 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087538 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087563 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087614 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087639 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087663 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087686 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087709 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087731 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087753 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087777 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087802 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087843 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087866 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087888 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087912 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087937 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087960 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.087988 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088011 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088035 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088060 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088086 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088112 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088136 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088159 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088182 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088184 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088206 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088509 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088555 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088595 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088632 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088671 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088741 
4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088779 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088843 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088958 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089009 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089224 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089268 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089304 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089337 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089375 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089408 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089443 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089478 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089513 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089556 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089660 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod 
\"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089700 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089735 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089772 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088456 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089858 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.088729 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089395 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089456 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089904 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089944 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089977 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090013 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090046 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090116 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090151 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.089870 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090434 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090838 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.084914 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.091174 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.092322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.092173 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.092404 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.092494 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.092700 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.093079 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.093268 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.093465 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.093499 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.093767 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.093917 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.090283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094079 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094141 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.094194 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094351 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094424 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094754 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" 
(OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.094775 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.095655 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.095953 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.095998 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.096452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkbxk\" (UniqueName: \"kubernetes.io/projected/db00f274-e86e-48c1-b0fe-5b4750265b85-kube-api-access-rkbxk\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.096164 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.096926 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.097040 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:26.597014563 +0000 UTC m=+109.287086544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.097497 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.097518 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.097660 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-log-socket\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.097665 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.097918 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.099277 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098904 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fda55cb-f6de-499d-8e5f-c48586fdfd34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nwgrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098049 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098534 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098560 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098682 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098892 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.097716 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100223 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100330 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100359 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-netns\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100388 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-netd\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100426 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100453 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-slash\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100479 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-os-release\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100561 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100592 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-systemd-units\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100619 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5c679df-0a81-4663-a3fc-d7247c933507-ovn-node-metrics-cert\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100647 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae82cb49-657a-4b47-8107-0729b9edf47b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 
13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100680 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100707 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-cni-bin\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100795 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-kubelet\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-hostroot\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098921 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098936 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098986 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.101119 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-etc-kubernetes\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.098994 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100468 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.101168 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.101237 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.101516 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.101150 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-bin\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.101873 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-env-overrides\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100589 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.100862 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102089 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pxbl\" (UniqueName: \"kubernetes.io/projected/d5c679df-0a81-4663-a3fc-d7247c933507-kube-api-access-2pxbl\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102439 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db00f274-e86e-48c1-b0fe-5b4750265b85-cni-binary-copy\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102595 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrhnc\" (UniqueName: \"kubernetes.io/projected/ae82cb49-657a-4b47-8107-0729b9edf47b-kube-api-access-hrhnc\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102951 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-etc-openvswitch\") pod 
\"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103129 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103432 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-node-log\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103568 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103755 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-cni-multus\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-config\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104080 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-system-cni-dir\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104199 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-var-lib-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104305 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e54723c-5e07-4a87-9410-92b770b28714-serviceca\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104401 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-netns\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104496 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104596 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvqw7\" (UniqueName: \"kubernetes.io/projected/8fda55cb-f6de-499d-8e5f-c48586fdfd34-kube-api-access-vvqw7\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-os-release\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103978 4921 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104880 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-k8s-cni-cncf-io\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104984 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-multus-certs\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105086 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-ovn-kubernetes\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105210 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae82cb49-657a-4b47-8107-0729b9edf47b-proxy-tls\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102645 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" 
(OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105343 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkzh6\" (UniqueName: \"kubernetes.io/projected/1e54723c-5e07-4a87-9410-92b770b28714-kube-api-access-fkzh6\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102742 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.102930 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103026 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103056 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103355 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.103382 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cni-binary-copy\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103778 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105678 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.105694 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:26.605673352 +0000 UTC m=+109.295745333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105719 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.103533 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105759 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-conf-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.104599 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105140 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.103949 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105913 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-cnibin\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.105930 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:26.605918979 +0000 UTC m=+109.295990960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105965 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.105971 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.106051 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8v6\" (UniqueName: \"kubernetes.io/projected/02d3ee9d-7145-4c65-94a8-55597fe7f574-kube-api-access-gj8v6\") pod \"node-resolver-5swj9\" (UID: \"02d3ee9d-7145-4c65-94a8-55597fe7f574\") " pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.106088 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.106103 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.106139 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-cni-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107463 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-socket-dir-parent\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107502 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cnibin\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02d3ee9d-7145-4c65-94a8-55597fe7f574-hosts-file\") pod \"node-resolver-5swj9\" (UID: 
\"02d3ee9d-7145-4c65-94a8-55597fe7f574\") " pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107578 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-system-cni-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107606 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107632 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e54723c-5e07-4a87-9410-92b770b28714-host\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107799 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107914 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ae82cb49-657a-4b47-8107-0729b9edf47b-rootfs\") pod \"machine-config-daemon-fkpqq\" (UID: 
\"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107961 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.107989 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-daemon-config\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-kubelet\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108068 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-systemd\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-ovn\") pod \"ovnkube-node-rl674\" 
(UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108100 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108148 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108184 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-script-lib\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108335 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108432 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108452 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108455 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108466 4921 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108498 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108526 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108549 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108567 4921 reconciler_common.go:293] "Volume detached for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108584 4921 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108604 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108622 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108640 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108660 4921 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108678 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108694 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on 
node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108715 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108732 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108750 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108769 4921 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108790 4921 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108808 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108848 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108866 4921 reconciler_common.go:293] "Volume detached for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108884 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108901 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108917 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108938 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108954 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108971 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.108990 4921 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109006 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109024 4921 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109040 4921 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109056 4921 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109074 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109091 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109110 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" 
DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109127 4921 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109144 4921 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109161 4921 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109205 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109223 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109240 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109257 4921 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109276 4921 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109294 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109312 4921 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109332 4921 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109352 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109368 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109386 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109404 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109424 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109441 4921 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109457 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109474 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109491 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.117997 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118077 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" 
DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118101 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118120 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118141 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.114968 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118330 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118361 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118396 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath 
\"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118414 4921 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118713 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118739 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118750 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118793 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.118805 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123192 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123245 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123263 4921 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123279 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123301 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123318 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123334 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 
crc kubenswrapper[4921]: I0312 13:11:26.123351 4921 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123375 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123391 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123400 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123422 4921 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124461 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124524 4921 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124554 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124588 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124608 4921 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124627 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124646 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124661 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124675 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124688 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124705 4921 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124728 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124741 4921 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124765 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124798 4921 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.124871 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.109879 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.111650 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.112003 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.112525 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.112623 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.112730 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.113092 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.113707 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.116800 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.117580 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.117885 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.117910 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.119134 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.121337 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.121834 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.122036 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.122043 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.122066 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.125057 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.122661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.122880 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125343 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125392 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125412 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125496 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:26.625474407 +0000 UTC m=+109.315546398 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.122923 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123647 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.123692 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.124779 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125880 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125899 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.125956 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:26.62594159 +0000 UTC m=+109.316013571 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.128497 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.129557 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.129578 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.129696 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.130072 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.130350 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.130745 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.130437 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.131662 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.131685 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.131736 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.131776 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.132274 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.132126 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.132356 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.132595 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.132644 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.132697 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.133003 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.133012 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.133281 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134292 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134339 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134588 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134694 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134715 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134834 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134839 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134842 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.136211 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134691 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134922 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.134933 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135001 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135039 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135054 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135191 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135383 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135404 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135546 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135558 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135592 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135612 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135711 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.136019 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.136102 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.135439 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.137031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.137041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.137541 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.137807 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.137952 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.137971 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.138014 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.138140 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.138371 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.138364 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.138463 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.138711 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139140 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139308 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139467 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139506 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139861 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139938 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.139944 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.140384 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.142071 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.142123 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.142637 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.142972 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.143337 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.143802 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.144029 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.144217 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.151806 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.158609 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.160713 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5swj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02d3ee9d-7145-4c65-94a8-55597fe7f574\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj8v6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5swj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.165145 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.174261 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.176659 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c679df-0a81-4663-a3fc-d7247c933507\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2a
f0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"
ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-rl674\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.177280 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.190463 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.221149 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.221224 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.221254 4921 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.221270 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.221285 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226597 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-hostroot\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226681 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-etc-kubernetes\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226736 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-bin\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226772 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-hostroot\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226824 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-etc-kubernetes\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226784 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-env-overrides\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226898 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pxbl\" (UniqueName: \"kubernetes.io/projected/d5c679df-0a81-4663-a3fc-d7247c933507-kube-api-access-2pxbl\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db00f274-e86e-48c1-b0fe-5b4750265b85-cni-binary-copy\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226951 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-cni-bin\") pod \"multus-q6tv6\" (UID: 
\"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226975 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-kubelet\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227003 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrhnc\" (UniqueName: \"kubernetes.io/projected/ae82cb49-657a-4b47-8107-0729b9edf47b-kube-api-access-hrhnc\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227048 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-etc-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-cni-bin\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227109 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-cni-multus\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " 
pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227084 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-cni-multus\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227182 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-node-log\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227187 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-var-lib-kubelet\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227252 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-var-lib-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227216 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-var-lib-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 
13:11:26.227297 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-node-log\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227322 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-config\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227339 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-etc-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.226936 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-bin\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227362 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-system-cni-dir\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vvqw7\" (UniqueName: \"kubernetes.io/projected/8fda55cb-f6de-499d-8e5f-c48586fdfd34-kube-api-access-vvqw7\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e54723c-5e07-4a87-9410-92b770b28714-serviceca\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227598 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-system-cni-dir\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227779 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-netns\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227843 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/db00f274-e86e-48c1-b0fe-5b4750265b85-cni-binary-copy\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-netns\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-multus-certs\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227969 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-ovn-kubernetes\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.227999 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae82cb49-657a-4b47-8107-0729b9edf47b-proxy-tls\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkzh6\" (UniqueName: \"kubernetes.io/projected/1e54723c-5e07-4a87-9410-92b770b28714-kube-api-access-fkzh6\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228049 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-multus-certs\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228057 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-env-overrides\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228079 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-os-release\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228115 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-k8s-cni-cncf-io\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228055 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-ovn-kubernetes\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228148 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-conf-dir\") pod \"multus-q6tv6\" (UID: 
\"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228166 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-os-release\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228185 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cni-binary-copy\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228202 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-conf-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228225 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228258 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj8v6\" (UniqueName: \"kubernetes.io/projected/02d3ee9d-7145-4c65-94a8-55597fe7f574-kube-api-access-gj8v6\") pod \"node-resolver-5swj9\" (UID: \"02d3ee9d-7145-4c65-94a8-55597fe7f574\") " 
pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228372 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-cnibin\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228401 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-cni-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228437 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-socket-dir-parent\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cnibin\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228495 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02d3ee9d-7145-4c65-94a8-55597fe7f574-hosts-file\") pod \"node-resolver-5swj9\" (UID: \"02d3ee9d-7145-4c65-94a8-55597fe7f574\") " pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228528 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-system-cni-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228559 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228591 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e54723c-5e07-4a87-9410-92b770b28714-host\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228619 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ae82cb49-657a-4b47-8107-0729b9edf47b-rootfs\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228729 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cni-binary-copy\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228835 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-system-cni-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228872 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cnibin\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228896 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-socket-dir-parent\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228919 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1e54723c-5e07-4a87-9410-92b770b28714-host\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228220 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-host-run-k8s-cni-cncf-io\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228992 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/02d3ee9d-7145-4c65-94a8-55597fe7f574-hosts-file\") pod \"node-resolver-5swj9\" (UID: 
\"02d3ee9d-7145-4c65-94a8-55597fe7f574\") " pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228988 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1e54723c-5e07-4a87-9410-92b770b28714-serviceca\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229025 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-cnibin\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229054 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ae82cb49-657a-4b47-8107-0729b9edf47b-rootfs\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-cni-dir\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.228650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-daemon-config\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229258 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-kubelet\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229290 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-systemd\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229319 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-ovn\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229348 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-script-lib\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229381 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-log-socket\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229391 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-kubelet\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229411 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229456 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229479 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-config\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229480 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkbxk\" (UniqueName: \"kubernetes.io/projected/db00f274-e86e-48c1-b0fe-5b4750265b85-kube-api-access-rkbxk\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229582 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229632 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229652 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/db00f274-e86e-48c1-b0fe-5b4750265b85-multus-daemon-config\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229684 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-netns\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229728 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-log-socket\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-netd\") pod 
\"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-openvswitch\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229785 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229873 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-os-release\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229947 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-slash\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.229999 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae82cb49-657a-4b47-8107-0729b9edf47b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fkpqq\" 
(UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230047 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-systemd-units\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230086 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-ovn\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230096 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5c679df-0a81-4663-a3fc-d7247c933507-ovn-node-metrics-cert\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230138 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-netns\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230152 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-os-release\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " 
pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-script-lib\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230275 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-slash\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230272 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-systemd\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230296 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230321 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.230359 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-netd\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230382 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-systemd-units\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230476 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230517 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230531 4921 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230544 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230557 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230592 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230607 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230621 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230635 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230648 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230687 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230700 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" 
DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230713 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230726 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230769 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230782 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230794 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230807 4921 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230800 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8fda55cb-f6de-499d-8e5f-c48586fdfd34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " 
pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230839 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230901 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230922 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230941 4921 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230960 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230978 4921 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.230996 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.231015 4921 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231032 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231050 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231070 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231089 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231108 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231126 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231143 4921 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231160 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231177 4921 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231194 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231211 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231228 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231247 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231265 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on 
node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231286 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231305 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231328 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231345 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231364 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231382 4921 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231399 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: 
I0312 13:11:26.231416 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231434 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231452 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231469 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231486 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231504 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231521 4921 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231539 4921 reconciler_common.go:293] 
"Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231555 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231573 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231592 4921 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231610 4921 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231627 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231645 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231662 4921 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231679 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231697 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231718 4921 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231735 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231755 4921 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231773 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231789 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.231837 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232359 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232377 4921 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232394 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232414 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232436 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232454 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232471 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232490 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232507 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232524 4921 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232540 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232557 4921 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232574 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.232593 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232610 4921 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232627 4921 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232644 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232661 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232679 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232700 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 
13:11:26.232717 4921 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232734 4921 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232752 4921 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232775 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232792 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232835 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232855 4921 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232873 4921 reconciler_common.go:293] "Volume 
detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232891 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232909 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232933 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232958 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.232975 4921 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.233132 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ae82cb49-657a-4b47-8107-0729b9edf47b-mcd-auth-proxy-config\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.233985 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8fda55cb-f6de-499d-8e5f-c48586fdfd34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.235035 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5c679df-0a81-4663-a3fc-d7247c933507-ovn-node-metrics-cert\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.236676 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ae82cb49-657a-4b47-8107-0729b9edf47b-proxy-tls\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.249970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvqw7\" (UniqueName: \"kubernetes.io/projected/8fda55cb-f6de-499d-8e5f-c48586fdfd34-kube-api-access-vvqw7\") pod \"multus-additional-cni-plugins-nwgrv\" (UID: \"8fda55cb-f6de-499d-8e5f-c48586fdfd34\") " pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.249966 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8v6\" (UniqueName: \"kubernetes.io/projected/02d3ee9d-7145-4c65-94a8-55597fe7f574-kube-api-access-gj8v6\") pod \"node-resolver-5swj9\" (UID: \"02d3ee9d-7145-4c65-94a8-55597fe7f574\") " pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.250188 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pxbl\" (UniqueName: \"kubernetes.io/projected/d5c679df-0a81-4663-a3fc-d7247c933507-kube-api-access-2pxbl\") pod \"ovnkube-node-rl674\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.250353 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrhnc\" (UniqueName: \"kubernetes.io/projected/ae82cb49-657a-4b47-8107-0729b9edf47b-kube-api-access-hrhnc\") pod \"machine-config-daemon-fkpqq\" (UID: \"ae82cb49-657a-4b47-8107-0729b9edf47b\") " pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.256428 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkzh6\" (UniqueName: \"kubernetes.io/projected/1e54723c-5e07-4a87-9410-92b770b28714-kube-api-access-fkzh6\") pod \"node-ca-rg7j6\" (UID: \"1e54723c-5e07-4a87-9410-92b770b28714\") " pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.257328 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkbxk\" (UniqueName: \"kubernetes.io/projected/db00f274-e86e-48c1-b0fe-5b4750265b85-kube-api-access-rkbxk\") pod \"multus-q6tv6\" (UID: \"db00f274-e86e-48c1-b0fe-5b4750265b85\") " pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.284079 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.296682 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5swj9" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.304871 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.311512 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d3ee9d_7145_4c65_94a8_55597fe7f574.slice/crio-f38b7636d7c5381219a2da6fff976a5166b88d15ad6e7939bb70e20ec3d10bab WatchSource:0}: Error finding container f38b7636d7c5381219a2da6fff976a5166b88d15ad6e7939bb70e20ec3d10bab: Status 404 returned error can't find the container with id f38b7636d7c5381219a2da6fff976a5166b88d15ad6e7939bb70e20ec3d10bab Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.315630 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rg7j6" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.322919 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.324120 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.324498 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.325332 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.325420 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.325719 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.328834 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-af84e788d5b99e8de0ae936f43f4b1364f98363e22bfd1d0b7658cd4f8f01f60 WatchSource:0}: Error finding container af84e788d5b99e8de0ae936f43f4b1364f98363e22bfd1d0b7658cd4f8f01f60: Status 404 returned error can't find the container with id af84e788d5b99e8de0ae936f43f4b1364f98363e22bfd1d0b7658cd4f8f01f60 Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.330613 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.338928 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.342197 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e54723c_5e07_4a87_9410_92b770b28714.slice/crio-373ec0e7c11ab8fcd2b3812b0324688ed088922fb0809177e5feadabf6b2610e WatchSource:0}: Error finding container 373ec0e7c11ab8fcd2b3812b0324688ed088922fb0809177e5feadabf6b2610e: Status 404 returned error can't find the container with id 373ec0e7c11ab8fcd2b3812b0324688ed088922fb0809177e5feadabf6b2610e Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.346798 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.355401 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-q6tv6" Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.360484 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae82cb49_657a_4b47_8107_0729b9edf47b.slice/crio-31845e502317df23f6b9839dd2e6c557ff296090f6c1b440ad81a5691764ec90 WatchSource:0}: Error finding container 31845e502317df23f6b9839dd2e6c557ff296090f6c1b440ad81a5691764ec90: Status 404 returned error can't find the container with id 31845e502317df23f6b9839dd2e6c557ff296090f6c1b440ad81a5691764ec90 Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.375012 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5c679df_0a81_4663_a3fc_d7247c933507.slice/crio-fdec2af208e179df589d5582612ee9723bba2bf10a9c43d1830cb2721b55c499 WatchSource:0}: Error finding container fdec2af208e179df589d5582612ee9723bba2bf10a9c43d1830cb2721b55c499: Status 404 returned error can't find the container with id fdec2af208e179df589d5582612ee9723bba2bf10a9c43d1830cb2721b55c499 Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.400332 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-33dd573552219751679241780e41aa7c931b76bdd05155d75420598c509a7933 WatchSource:0}: Error finding container 33dd573552219751679241780e41aa7c931b76bdd05155d75420598c509a7933: Status 404 returned error can't find the container with id 33dd573552219751679241780e41aa7c931b76bdd05155d75420598c509a7933 Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.402862 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fda55cb_f6de_499d_8e5f_c48586fdfd34.slice/crio-e9e03b58d125c470345b704127662ed89b8a67d5f43fae59184fa0adb7894530 
WatchSource:0}: Error finding container e9e03b58d125c470345b704127662ed89b8a67d5f43fae59184fa0adb7894530: Status 404 returned error can't find the container with id e9e03b58d125c470345b704127662ed89b8a67d5f43fae59184fa0adb7894530 Mar 12 13:11:26 crc kubenswrapper[4921]: W0312 13:11:26.417244 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb00f274_e86e_48c1_b0fe_5b4750265b85.slice/crio-ee24637b20ac7b4319d31d3124e73b1de7e5ddb601f6ca7fa2f6ae39d7085041 WatchSource:0}: Error finding container ee24637b20ac7b4319d31d3124e73b1de7e5ddb601f6ca7fa2f6ae39d7085041: Status 404 returned error can't find the container with id ee24637b20ac7b4319d31d3124e73b1de7e5ddb601f6ca7fa2f6ae39d7085041 Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.427653 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.427681 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.427706 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.427720 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.427732 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.485383 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerStarted","Data":"e9e03b58d125c470345b704127662ed89b8a67d5f43fae59184fa0adb7894530"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.488567 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"31c3fdc807ab75c430672f592f2155cb082dfbf730e550f818103f30b992c49d"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.497160 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"fdec2af208e179df589d5582612ee9723bba2bf10a9c43d1830cb2721b55c499"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.498763 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5swj9" event={"ID":"02d3ee9d-7145-4c65-94a8-55597fe7f574","Type":"ContainerStarted","Data":"f38b7636d7c5381219a2da6fff976a5166b88d15ad6e7939bb70e20ec3d10bab"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.500092 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"33dd573552219751679241780e41aa7c931b76bdd05155d75420598c509a7933"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.503153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rg7j6" event={"ID":"1e54723c-5e07-4a87-9410-92b770b28714","Type":"ContainerStarted","Data":"373ec0e7c11ab8fcd2b3812b0324688ed088922fb0809177e5feadabf6b2610e"} Mar 12 13:11:26 crc 
kubenswrapper[4921]: I0312 13:11:26.504138 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"af84e788d5b99e8de0ae936f43f4b1364f98363e22bfd1d0b7658cd4f8f01f60"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.504785 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6tv6" event={"ID":"db00f274-e86e-48c1-b0fe-5b4750265b85","Type":"ContainerStarted","Data":"ee24637b20ac7b4319d31d3124e73b1de7e5ddb601f6ca7fa2f6ae39d7085041"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.505942 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"31845e502317df23f6b9839dd2e6c557ff296090f6c1b440ad81a5691764ec90"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.532927 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.532965 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.532974 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.532994 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.533005 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.635182 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.635250 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.635268 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.635291 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.635307 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.636374 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.636461 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.636492 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.636512 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.636532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636594 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:27.636567673 +0000 UTC m=+110.326639674 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636610 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636652 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636664 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636702 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636717 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636717 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636670 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636772 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636675 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:27.636658536 +0000 UTC m=+110.326730597 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636846 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:27.636808481 +0000 UTC m=+110.326880462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636871 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:27.636859502 +0000 UTC m=+110.326931573 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:26 crc kubenswrapper[4921]: E0312 13:11:26.636889 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:27.636879473 +0000 UTC m=+110.326951564 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.737899 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.737971 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.737982 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.738019 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.738041 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.840799 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.840868 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.840881 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.840901 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.840915 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.944102 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.944163 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.944182 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.944208 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:26 crc kubenswrapper[4921]: I0312 13:11:26.944225 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:26Z","lastTransitionTime":"2026-03-12T13:11:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.046689 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.046733 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.046745 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.046761 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.046777 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.149701 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.150157 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.150166 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.150181 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.150190 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.252507 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.252554 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.252566 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.252588 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.252601 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.355577 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.355637 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.355650 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.355675 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.355691 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.458406 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.458480 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.458508 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.458543 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.458569 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.514126 4921 generic.go:334] "Generic (PLEG): container finished" podID="8fda55cb-f6de-499d-8e5f-c48586fdfd34" containerID="778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9" exitCode=0 Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.514231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerDied","Data":"778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.518294 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f" exitCode=0 Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.518392 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.520675 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5swj9" event={"ID":"02d3ee9d-7145-4c65-94a8-55597fe7f574","Type":"ContainerStarted","Data":"60b74afd5e6ba1fe8d4a2ffe95b2ea3c02d5f251a89327f82db57a3b5749d41b"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.523417 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rg7j6" event={"ID":"1e54723c-5e07-4a87-9410-92b770b28714","Type":"ContainerStarted","Data":"8817d3829c88d78e583ecb69bfc7378dc827db26bbec90690b359adfa4c5e055"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.526281 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8a7a8dbc82235493c2f9467168d359b20cc566794d1fb076f3239bca42d51dc7"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.526318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"01c2aa3a9ab67226f08d27d57c3b3285d56763f18942b35b4cc60578d7a82490"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.530921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6tv6" event={"ID":"db00f274-e86e-48c1-b0fe-5b4750265b85","Type":"ContainerStarted","Data":"64277a86d9ee84804412d148ece2e3feff1c23021c557d7224d7bab6172ce894"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.533060 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cde4cd65e62a0ad05f09d888fbd1a7767ed386c610413ea5cec91b586cdf911e"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.542124 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6tv6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"db00f274-e86e-48c1-b0fe-5b4750265b85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkbxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6tv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.545199 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"d118bdb3b6804196dedf1cad2dd46b463023cc9832d8c63003360e45958bbfb3"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.545269 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.566074 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.566126 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.566140 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.566162 
4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.566177 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.572210 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.588398 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg7j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e54723c-5e07-4a87-9410-92b770b28714\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg7j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.603340 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.617420 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae82cb49-657a-4b47-8107-0729b9edf47b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fkpqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.637358 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fda55cb-f6de-499d-8e5f-c48586fdfd34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nwgrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.650362 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.650496 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650590 4921 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:29.650559002 +0000 UTC m=+112.340631043 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.650697 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.650730 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.650758 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650803 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650859 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650877 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650965 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650978 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650988 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.650998 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.651059 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.651105 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:29.651091928 +0000 UTC m=+112.341163899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.651121 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:29.651114579 +0000 UTC m=+112.341186550 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.651134 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:29.65112845 +0000 UTC m=+112.341200421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.651145 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:29.65114083 +0000 UTC m=+112.341212801 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.654285 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.667369 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.669070 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.669095 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 
13:11:27.669103 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.669120 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.669129 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.683080 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.694242 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5swj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02d3ee9d-7145-4c65-94a8-55597fe7f574\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj8v6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5swj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.712219 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c679df-0a81-4663-a3fc-d7247c933507\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl674\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.725884 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.739150 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.749567 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg7j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e54723c-5e07-4a87-9410-92b770b28714\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8817d3829c88d78e583ecb69bfc7378dc827db26bbec90690b359adfa4c5e055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg7j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.763247 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-q6tv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db00f274-e86e-48c1-b0fe-5b4750265b85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64277a86d9ee84804412d148ece2e3feff1c23021c557d7224d7bab6172ce894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkbxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6tv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.771383 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.771410 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.771419 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.771436 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.771446 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.775797 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.791995 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae82cb49-657a-4b47-8107-0729b9edf47b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d118bdb3b6804196dedf1cad2dd46b463023cc9832d8c63003360e45958bbfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3558d676a3c882348661fd9967700d0303846062
8a1f557e21868fc5a9c603bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fkpqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.808671 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fda55cb-f6de-499d-8e5f-c48586fdfd34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nwgrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.820320 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5swj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02d3ee9d-7145-4c65-94a8-55597fe7f574\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b74afd5e6ba1fe8d4a2ffe95b2ea3c02d5f251a89327f82db57a3b5749d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj8v6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5swj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.844258 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c679df-0a81-4663-a3fc-d7247c933507\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl674\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.860840 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.872791 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.874223 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.874274 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.874283 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.874301 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.874336 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.885539 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a7a8dbc82235493c2f9467168d359b20cc566794d1fb076f3239bca42d51dc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c2aa3a9ab67226f08d27d57c3b3285d56763f18942b35b4cc60578d7a82490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.906353 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde4cd65e62a0ad05f09d888fbd1a7767ed386c610413ea5cec91b586cdf911e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.979028 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.979092 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.979106 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.979143 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.979156 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:27Z","lastTransitionTime":"2026-03-12T13:11:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.982924 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.983079 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.983215 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.983354 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.983478 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:27 crc kubenswrapper[4921]: E0312 13:11:27.983557 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.997699 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:27Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:27 crc kubenswrapper[4921]: I0312 13:11:27.999143 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.000102 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.001657 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.002419 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.003856 
4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.012451 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae82cb49-657a-4b47-8107-0729b9edf47b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d118bdb3b6804196dedf1cad2dd46b463023cc9832d8c63003360e45958bbfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fkpqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.029750 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-nwgrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fda55cb-f6de-499d-8e5f-c48586fdfd34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nwgrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.044840 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.061375 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.079659 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a7a8dbc82235493c2f9467168d359b20cc566794d1fb076f3239bca42d51dc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c2aa3a9ab67226f08d27d57c3b3285d56763f18942b35b4cc60578d7a82490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.081825 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.082349 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.082362 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.082382 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.082395 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.094535 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5swj9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"02d3ee9d-7145-4c65-94a8-55597fe7f574\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b74afd5e6ba1fe8d4a2ffe95b2ea3c02d5f251a89327f82db57a3b5749d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj8v6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5swj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.104244 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.106105 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.107608 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.108717 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.109422 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.110691 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.111583 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.112625 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.113196 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.113639 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5c679df-0a81-4663-a3fc-d7247c933507\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2pxbl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rl674\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.113916 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.114882 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.115552 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.117166 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.119216 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 
13:11:28.120230 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.120876 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.122228 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.125955 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.130782 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.131646 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.133315 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.134233 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 
13:11:28.135564 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.136565 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.137717 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.138380 4921 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.138529 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.140154 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cde4cd65e62a0ad05f09d888fbd1a7767ed386c610413ea5cec91b586cdf911e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.141246 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.142092 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.142767 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.145156 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.147267 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.148466 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.149499 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.150869 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.151523 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.153222 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.154151 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.155537 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.156176 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.157657 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.158439 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.159979 4921 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-q6tv6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"db00f274-e86e-48c1-b0fe-5b4750265b85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64277a86d9ee84804412d148ece2e3feff1c23021c557d7224d7bab6172ce894\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rkbxk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-q6tv6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.160699 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.161387 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.162589 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.163318 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.165493 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.166548 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.167220 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.172729 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.181170 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg7j6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1e54723c-5e07-4a87-9410-92b770b28714\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8817d3829c88d78e583ecb69bfc7378dc827db26bbec90690b359adfa4c5e055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fkzh6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg7j6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.184915 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.184937 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.184945 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.184958 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.184966 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.288150 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.288187 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.288195 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.288211 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.288223 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.391707 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.391736 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.391744 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.391757 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.391766 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.494361 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.494930 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.494943 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.494969 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.494981 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.550659 4921 generic.go:334] "Generic (PLEG): container finished" podID="8fda55cb-f6de-499d-8e5f-c48586fdfd34" containerID="c366f302b4a442b14b61397d16048932b80ec6a21795337bd2ae860873b67b40" exitCode=0 Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.550749 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerDied","Data":"c366f302b4a442b14b61397d16048932b80ec6a21795337bd2ae860873b67b40"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.555865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.555908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.555918 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.555927 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.555938 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.568447 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.587358 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae82cb49-657a-4b47-8107-0729b9edf47b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d118bdb3b6804196dedf1cad2dd46b463023cc9832d8c63003360e45958bbfb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3558d676a3c882348661fd9967700d0303846062
8a1f557e21868fc5a9c603bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hrhnc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-fkpqq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.597273 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.597323 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.597334 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc 
kubenswrapper[4921]: I0312 13:11:28.597349 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.597360 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.609175 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fda55cb-f6de-499d-8e5f-c48586fdfd34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://778ae10702983671bc87b6535678ff40fc350cfd45a63ab5042164d16fc2a7b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c366f302b4a442b14b61397d16048932b80ec6a21795337bd2ae860873b67b40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c366f302b4a442b14b61397d16048932b80ec6a21795337bd2ae860873b67b40\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-12T13:11:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-12T13:11:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vvqw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nwgrv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.625408 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.641570 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.658630 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a7a8dbc82235493c2f9467168d359b20cc566794d1fb076f3239bca42d51dc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://01c2aa3a9ab67226f08d27d57c3b3285d56763f18942b35b4cc60578d7a82490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.672017 4921 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5swj9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"02d3ee9d-7145-4c65-94a8-55597fe7f574\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-12T13:11:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60b74afd5e6ba1fe8d4a2ffe95b2ea3c02d5f251a89327f82db57a3b5749d41b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-12T13:11:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gj8v6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-12T13:11:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5swj9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-12T13:11:28Z is after 2025-08-24T17:21:41Z" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.699847 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.699880 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.699890 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.699904 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.699915 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.744854 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q6tv6" podStartSLOduration=46.744804884 podStartE2EDuration="46.744804884s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:28.744722851 +0000 UTC m=+111.434794822" watchObservedRunningTime="2026-03-12 13:11:28.744804884 +0000 UTC m=+111.434876865" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.745303 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rg7j6" podStartSLOduration=46.745296039 podStartE2EDuration="46.745296039s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:28.731073657 +0000 UTC m=+111.421145648" watchObservedRunningTime="2026-03-12 13:11:28.745296039 +0000 UTC m=+111.435368020" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.802001 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.802037 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.802049 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.802067 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.802079 4921 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.849282 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4"] Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.849753 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.851886 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.852055 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.862223 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podStartSLOduration=46.86220192 podStartE2EDuration="46.86220192s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:28.861711765 +0000 UTC m=+111.551783726" watchObservedRunningTime="2026-03-12 13:11:28.86220192 +0000 UTC m=+111.552273911" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.865657 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-5jsfz"] Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.866167 4921 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:28 crc kubenswrapper[4921]: E0312 13:11:28.866233 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jsfz" podUID="9823f1cf-662f-4896-a6a0-a3bfb3aa660b" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.866505 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fe7c14d-f890-466d-bd5e-94c5ebef3893-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.866539 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fe7c14d-f890-466d-bd5e-94c5ebef3893-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.866557 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fe7c14d-f890-466d-bd5e-94c5ebef3893-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 
13:11:28.866666 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chgsd\" (UniqueName: \"kubernetes.io/projected/1fe7c14d-f890-466d-bd5e-94c5ebef3893-kube-api-access-chgsd\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.904191 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.904241 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.904253 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.904273 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.904285 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:28Z","lastTransitionTime":"2026-03-12T13:11:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.953269 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5swj9" podStartSLOduration=46.953242278 podStartE2EDuration="46.953242278s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:28.952859557 +0000 UTC m=+111.642931538" watchObservedRunningTime="2026-03-12 13:11:28.953242278 +0000 UTC m=+111.643314249" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.967930 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.967986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fe7c14d-f890-466d-bd5e-94c5ebef3893-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.968012 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5vs4\" (UniqueName: \"kubernetes.io/projected/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-kube-api-access-f5vs4\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.968032 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fe7c14d-f890-466d-bd5e-94c5ebef3893-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.968049 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fe7c14d-f890-466d-bd5e-94c5ebef3893-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.968080 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chgsd\" (UniqueName: \"kubernetes.io/projected/1fe7c14d-f890-466d-bd5e-94c5ebef3893-kube-api-access-chgsd\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.968982 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1fe7c14d-f890-466d-bd5e-94c5ebef3893-env-overrides\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.969129 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1fe7c14d-f890-466d-bd5e-94c5ebef3893-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.976572 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1fe7c14d-f890-466d-bd5e-94c5ebef3893-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:28 crc kubenswrapper[4921]: I0312 13:11:28.988430 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chgsd\" (UniqueName: \"kubernetes.io/projected/1fe7c14d-f890-466d-bd5e-94c5ebef3893-kube-api-access-chgsd\") pod \"ovnkube-control-plane-749d76644c-jnmz4\" (UID: \"1fe7c14d-f890-466d-bd5e-94c5ebef3893\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.007434 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.007478 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.007491 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.007511 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.007525 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.068970 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5vs4\" (UniqueName: \"kubernetes.io/projected/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-kube-api-access-f5vs4\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.069072 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.069185 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.069240 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs podName:9823f1cf-662f-4896-a6a0-a3bfb3aa660b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:29.569225171 +0000 UTC m=+112.259297142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs") pod "network-metrics-daemon-5jsfz" (UID: "9823f1cf-662f-4896-a6a0-a3bfb3aa660b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.089181 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5vs4\" (UniqueName: \"kubernetes.io/projected/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-kube-api-access-f5vs4\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.111227 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.111273 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.111281 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.111301 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.111311 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.214244 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.214311 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.214328 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.214352 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.214372 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.219843 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" Mar 12 13:11:29 crc kubenswrapper[4921]: W0312 13:11:29.245181 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fe7c14d_f890_466d_bd5e_94c5ebef3893.slice/crio-e03bb4bdedcc1e06394df7cfc063cbbce1c6ab42f2fbc046e0365da46a9d1142 WatchSource:0}: Error finding container e03bb4bdedcc1e06394df7cfc063cbbce1c6ab42f2fbc046e0365da46a9d1142: Status 404 returned error can't find the container with id e03bb4bdedcc1e06394df7cfc063cbbce1c6ab42f2fbc046e0365da46a9d1142 Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.316840 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.317258 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.317273 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.317291 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.317304 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.420296 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.420357 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.420372 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.420396 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.420411 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.524355 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.524392 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.524407 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.524428 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.524443 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.562968 4921 generic.go:334] "Generic (PLEG): container finished" podID="8fda55cb-f6de-499d-8e5f-c48586fdfd34" containerID="fd15400371a724fdaba07fe31e6b1f077f5105f0e906eea69fd87c725170c884" exitCode=0 Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.563070 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerDied","Data":"fd15400371a724fdaba07fe31e6b1f077f5105f0e906eea69fd87c725170c884"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.565185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8e6f8c16d06969b111a68624e81ee54d5f3d30bfd19ddd5758b7e912c2ea890e"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.569138 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" event={"ID":"1fe7c14d-f890-466d-bd5e-94c5ebef3893","Type":"ContainerStarted","Data":"63d7b966facaed00d1ec236943249c2b67a442a33ae0a099b54d72009e35919a"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.569183 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" event={"ID":"1fe7c14d-f890-466d-bd5e-94c5ebef3893","Type":"ContainerStarted","Data":"e03bb4bdedcc1e06394df7cfc063cbbce1c6ab42f2fbc046e0365da46a9d1142"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.573935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " 
pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.574051 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.574099 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs podName:9823f1cf-662f-4896-a6a0-a3bfb3aa660b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:30.574086965 +0000 UTC m=+113.264158926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs") pod "network-metrics-daemon-5jsfz" (UID: "9823f1cf-662f-4896-a6a0-a3bfb3aa660b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.577796 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.626683 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.626733 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.626753 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.626771 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.626782 4921 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.674681 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.674830 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:33.674800223 +0000 UTC m=+116.364872194 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.674876 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.674912 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.674930 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.674948 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675012 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675066 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:33.675051231 +0000 UTC m=+116.365123202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675067 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675090 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675100 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-12 13:11:33.675091972 +0000 UTC m=+116.365163943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675105 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675106 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675117 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675125 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675125 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 
13:11:29.675149 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:33.675141244 +0000 UTC m=+116.365213215 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.675161 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:33.675156344 +0000 UTC m=+116.365228315 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.729595 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.729628 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.729657 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.729671 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.729680 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.831562 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.831602 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.831610 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.831628 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.831637 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.934934 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.934986 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.935002 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.935026 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.935042 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:29Z","lastTransitionTime":"2026-03-12T13:11:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.983247 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.983282 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:29 crc kubenswrapper[4921]: I0312 13:11:29.983282 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.983416 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.983498 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 13:11:29 crc kubenswrapper[4921]: E0312 13:11:29.983574 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.037440 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.037487 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.037498 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.037515 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.037527 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.140698 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.140749 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.140763 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.140782 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.140798 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.243647 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.243686 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.243696 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.243710 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.243719 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.345947 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.345981 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.345993 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.346008 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.346018 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.447857 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.447926 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.447943 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.447966 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.447985 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.550481 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.550552 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.550575 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.550609 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.550633 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.583058 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:30 crc kubenswrapper[4921]: E0312 13:11:30.583227 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:30 crc kubenswrapper[4921]: E0312 13:11:30.583291 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs podName:9823f1cf-662f-4896-a6a0-a3bfb3aa660b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:32.583275154 +0000 UTC m=+115.273347125 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs") pod "network-metrics-daemon-5jsfz" (UID: "9823f1cf-662f-4896-a6a0-a3bfb3aa660b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.583866 4921 generic.go:334] "Generic (PLEG): container finished" podID="8fda55cb-f6de-499d-8e5f-c48586fdfd34" containerID="27c6ce946c1cc0d7403b0f11aaac7245b32ad633f3dd87bf8972f617a128ffce" exitCode=0 Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.583905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerDied","Data":"27c6ce946c1cc0d7403b0f11aaac7245b32ad633f3dd87bf8972f617a128ffce"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.586136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" event={"ID":"1fe7c14d-f890-466d-bd5e-94c5ebef3893","Type":"ContainerStarted","Data":"d332da96310da96dd960d63ff3923501ee3e7e6b49fb6bd1afb54d671ab91731"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.652986 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.653045 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.653058 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.653074 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.653083 4921 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.755230 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.755270 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.755281 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.755299 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.755311 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.857824 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.857861 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.857871 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.857886 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.857897 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.960419 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.960463 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.960473 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.960487 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.960497 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:30Z","lastTransitionTime":"2026-03-12T13:11:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:30 crc kubenswrapper[4921]: I0312 13:11:30.982449 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:30 crc kubenswrapper[4921]: E0312 13:11:30.982598 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jsfz" podUID="9823f1cf-662f-4896-a6a0-a3bfb3aa660b" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.063022 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.063084 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.063101 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.063125 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.063178 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.165521 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.165600 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.165620 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.165645 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.165665 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.269597 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.269927 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.269960 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.269984 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.270002 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.373296 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.373343 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.373355 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.373376 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.373389 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.476457 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.476520 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.476546 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.476568 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.476582 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.583028 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.583084 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.583099 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.583121 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.583135 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.595076 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.599346 4921 generic.go:334] "Generic (PLEG): container finished" podID="8fda55cb-f6de-499d-8e5f-c48586fdfd34" containerID="2adda7108baa68c9b4d40f662a649864bd18bd863f0d8a0f3b2e0d3dee2200bc" exitCode=0 Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.599480 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerDied","Data":"2adda7108baa68c9b4d40f662a649864bd18bd863f0d8a0f3b2e0d3dee2200bc"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.629696 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-jnmz4" podStartSLOduration=48.629657809 podStartE2EDuration="48.629657809s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:30.617042543 +0000 UTC m=+113.307114544" watchObservedRunningTime="2026-03-12 13:11:31.629657809 +0000 UTC m=+114.319729820" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.685428 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.685500 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.685519 4921 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.685548 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.685571 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.789143 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.789203 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.789221 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.789245 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.789261 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.893408 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.893858 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.894039 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.894433 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.895031 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.983092 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.983162 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.983169 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:31 crc kubenswrapper[4921]: E0312 13:11:31.983223 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 13:11:31 crc kubenswrapper[4921]: E0312 13:11:31.983289 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 13:11:31 crc kubenswrapper[4921]: E0312 13:11:31.983380 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.997409 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.997462 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.997472 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.997491 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:31 crc kubenswrapper[4921]: I0312 13:11:31.997501 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:31Z","lastTransitionTime":"2026-03-12T13:11:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.099756 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.099807 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.099867 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.099888 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.099931 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.202657 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.202737 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.202770 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.202798 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.202909 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.306147 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.306221 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.306248 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.306279 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.306304 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.410096 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.410163 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.410186 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.410218 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.410240 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.513445 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.513542 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.513565 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.513596 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.513618 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.608383 4921 generic.go:334] "Generic (PLEG): container finished" podID="8fda55cb-f6de-499d-8e5f-c48586fdfd34" containerID="0a6631a23a43424d27e95dbceeaae4d437a16477dd8d0df62092093cb1e2af52" exitCode=0 Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.608469 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerDied","Data":"0a6631a23a43424d27e95dbceeaae4d437a16477dd8d0df62092093cb1e2af52"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.616744 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.616794 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.616847 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.616878 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.616902 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.625463 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:32 crc kubenswrapper[4921]: E0312 13:11:32.625610 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:32 crc kubenswrapper[4921]: E0312 13:11:32.625669 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs podName:9823f1cf-662f-4896-a6a0-a3bfb3aa660b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:36.625652229 +0000 UTC m=+119.315724210 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs") pod "network-metrics-daemon-5jsfz" (UID: "9823f1cf-662f-4896-a6a0-a3bfb3aa660b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.718775 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.719192 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.719212 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.719275 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.719292 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.822544 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.822587 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.822599 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.822615 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.822626 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.925215 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.925248 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.925257 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.925270 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.925282 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:32Z","lastTransitionTime":"2026-03-12T13:11:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:32 crc kubenswrapper[4921]: I0312 13:11:32.982799 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:32 crc kubenswrapper[4921]: E0312 13:11:32.983005 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jsfz" podUID="9823f1cf-662f-4896-a6a0-a3bfb3aa660b" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.027426 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.027485 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.027503 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.027528 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.027545 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.131157 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.131205 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.131217 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.131237 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.131251 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.234499 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.234547 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.234566 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.234587 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.234599 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.338097 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.338163 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.338186 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.338217 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.338239 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.441316 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.441355 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.441367 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.441384 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.441397 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.544560 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.544655 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.544715 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.544749 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.544775 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.622601 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" event={"ID":"8fda55cb-f6de-499d-8e5f-c48586fdfd34","Type":"ContainerStarted","Data":"d6254bdacfceac8e42045d3fa43cd9d544c9c24ed44e56ce7b7d5f47bfb83af2"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.629722 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerStarted","Data":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.630262 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.630295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.630307 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.648290 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.648354 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.648377 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.648404 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.648427 4921 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.679235 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.679315 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.686561 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podStartSLOduration=50.686525734 podStartE2EDuration="50.686525734s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:33.685068288 +0000 UTC m=+116.375140279" watchObservedRunningTime="2026-03-12 13:11:33.686525734 +0000 UTC m=+116.376597705" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.686905 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nwgrv" podStartSLOduration=51.686898456 podStartE2EDuration="51.686898456s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:33.648929996 +0000 UTC m=+116.339001997" watchObservedRunningTime="2026-03-12 13:11:33.686898456 +0000 UTC m=+116.376970427" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 
13:11:33.736415 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.736702 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.736749 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.736691092 +0000 UTC m=+124.426763123 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.736945 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.736977 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.736999 4921 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.737091 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.737062604 +0000 UTC m=+124.427134745 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.736914 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.737138 4921 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.737270 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.73724549 +0000 UTC m=+124.427317471 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.737601 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.737801 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.737806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.737859 4921 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.737948 4921 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.738037 4921 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.738062 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.738024123 +0000 UTC m=+124.428096134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.738140 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.738114886 +0000 UTC m=+124.428187057 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.751919 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.751978 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.751990 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.752011 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.752024 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.854914 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.854988 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.855007 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.855032 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.855049 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.958472 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.958543 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.958559 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.958582 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.958600 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:33Z","lastTransitionTime":"2026-03-12T13:11:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.983355 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.983468 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:33 crc kubenswrapper[4921]: I0312 13:11:33.983549 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.983601 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.983740 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 13:11:33 crc kubenswrapper[4921]: E0312 13:11:33.983960 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.062286 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.062362 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.062382 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.062413 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.062433 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.165875 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.165922 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.165935 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.165956 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.165969 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.269946 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.270009 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.270026 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.270050 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.270067 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.374104 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.374703 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.374723 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.374750 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.374771 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.479431 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.479501 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.479518 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.479545 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.479565 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.582974 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.583025 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.583043 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.583068 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.583090 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.685589 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.685618 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.685628 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.685639 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.685647 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.788111 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.788157 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.788166 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.788181 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.788192 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.891109 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.891167 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.891180 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.891198 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.891216 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.982774 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:34 crc kubenswrapper[4921]: E0312 13:11:34.982952 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jsfz" podUID="9823f1cf-662f-4896-a6a0-a3bfb3aa660b" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.993635 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.993713 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.993727 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.993745 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:34 crc kubenswrapper[4921]: I0312 13:11:34.993758 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:34Z","lastTransitionTime":"2026-03-12T13:11:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.097701 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.097769 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.097782 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.097806 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.097847 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.168512 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5jsfz"] Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.201309 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.201416 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.201434 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.201457 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.201474 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.303786 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.303886 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.303912 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.303940 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.303961 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.407616 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.407654 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.407665 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.407681 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.407695 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.511403 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.511468 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.511488 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.511514 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.511534 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.615303 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.615349 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.615362 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.615382 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.615395 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.653545 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:35 crc kubenswrapper[4921]: E0312 13:11:35.653685 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jsfz" podUID="9823f1cf-662f-4896-a6a0-a3bfb3aa660b" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.718118 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.718146 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.718156 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.718172 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.718180 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.821241 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.821327 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.821346 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.821380 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.821399 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.905468 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.905527 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.905549 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.905574 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.905598 4921 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-12T13:11:35Z","lastTransitionTime":"2026-03-12T13:11:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.967767 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz"] Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.968406 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.973129 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.973205 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.973283 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.973658 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.982675 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.982732 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.982865 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 12 13:11:35 crc kubenswrapper[4921]: E0312 13:11:35.982927 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 12 13:11:35 crc kubenswrapper[4921]: I0312 13:11:35.982975 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:35 crc kubenswrapper[4921]: E0312 13:11:35.983166 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 12 13:11:35 crc kubenswrapper[4921]: E0312 13:11:35.983616 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.000450 4921 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.003775 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.005983 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.066020 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a02b83a4-aca2-44c5-a79f-0f1901754cb0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.066138 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a02b83a4-aca2-44c5-a79f-0f1901754cb0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.066235 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02b83a4-aca2-44c5-a79f-0f1901754cb0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.066269 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02b83a4-aca2-44c5-a79f-0f1901754cb0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.066303 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a02b83a4-aca2-44c5-a79f-0f1901754cb0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.166939 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/a02b83a4-aca2-44c5-a79f-0f1901754cb0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.167122 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a02b83a4-aca2-44c5-a79f-0f1901754cb0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.167120 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" 
(UniqueName: \"kubernetes.io/host-path/a02b83a4-aca2-44c5-a79f-0f1901754cb0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.167204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02b83a4-aca2-44c5-a79f-0f1901754cb0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.167229 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02b83a4-aca2-44c5-a79f-0f1901754cb0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.167272 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a02b83a4-aca2-44c5-a79f-0f1901754cb0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.167365 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/a02b83a4-aca2-44c5-a79f-0f1901754cb0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 
13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.168690 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a02b83a4-aca2-44c5-a79f-0f1901754cb0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.175147 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02b83a4-aca2-44c5-a79f-0f1901754cb0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.189769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a02b83a4-aca2-44c5-a79f-0f1901754cb0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-gtlnz\" (UID: \"a02b83a4-aca2-44c5-a79f-0f1901754cb0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.288648 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" Mar 12 13:11:36 crc kubenswrapper[4921]: W0312 13:11:36.309640 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda02b83a4_aca2_44c5_a79f_0f1901754cb0.slice/crio-92f95977fe0094cafb1e3de2e69261d0d58af42c9e2ab8f2aaf6cf00ff4b4d8d WatchSource:0}: Error finding container 92f95977fe0094cafb1e3de2e69261d0d58af42c9e2ab8f2aaf6cf00ff4b4d8d: Status 404 returned error can't find the container with id 92f95977fe0094cafb1e3de2e69261d0d58af42c9e2ab8f2aaf6cf00ff4b4d8d Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.657857 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.660537 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8"} Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.660755 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.661535 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" event={"ID":"a02b83a4-aca2-44c5-a79f-0f1901754cb0","Type":"ContainerStarted","Data":"f3d3edecda0532eb9884f2a231eb8597f85a84e8f557aefdf64d0b0d055f5086"} Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.661571 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" 
event={"ID":"a02b83a4-aca2-44c5-a79f-0f1901754cb0","Type":"ContainerStarted","Data":"92f95977fe0094cafb1e3de2e69261d0d58af42c9e2ab8f2aaf6cf00ff4b4d8d"} Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.672936 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:36 crc kubenswrapper[4921]: E0312 13:11:36.673127 4921 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:36 crc kubenswrapper[4921]: E0312 13:11:36.673363 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs podName:9823f1cf-662f-4896-a6a0-a3bfb3aa660b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:44.673342577 +0000 UTC m=+127.363414548 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs") pod "network-metrics-daemon-5jsfz" (UID: "9823f1cf-662f-4896-a6a0-a3bfb3aa660b") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.677516 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=1.677497426 podStartE2EDuration="1.677497426s" podCreationTimestamp="2026-03-12 13:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:36.676265358 +0000 UTC m=+119.366337339" watchObservedRunningTime="2026-03-12 13:11:36.677497426 +0000 UTC m=+119.367569397" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.690073 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-gtlnz" podStartSLOduration=54.690052916 podStartE2EDuration="54.690052916s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:36.689656884 +0000 UTC m=+119.379728875" watchObservedRunningTime="2026-03-12 13:11:36.690052916 +0000 UTC m=+119.380124897" Mar 12 13:11:36 crc kubenswrapper[4921]: I0312 13:11:36.982255 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:36 crc kubenswrapper[4921]: E0312 13:11:36.982393 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jsfz" podUID="9823f1cf-662f-4896-a6a0-a3bfb3aa660b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.377879 4921 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.378183 4921 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.417547 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gg92s"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.418092 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.418844 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4c92v"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.419417 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.420008 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r7sfx"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.420436 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.421252 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pztgf"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.421929 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.422467 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.422990 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.423885 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.424863 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.425049 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.429720 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.434217 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.435068 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.435680 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.435989 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.436117 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j5pwt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.436865 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.437269 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.449237 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.450603 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.451001 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.451784 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.452464 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.452570 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.453063 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.455390 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.456426 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.458835 4921 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.458974 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.458996 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.459122 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.459189 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.459289 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.459439 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.459711 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.459905 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.460172 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.460189 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.460308 4921 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.460404 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.460584 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.460925 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.461020 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.461175 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.465139 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.465606 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.466076 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.466616 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.467937 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468018 4921 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468342 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468498 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468542 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468632 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468875 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.468925 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.469033 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.469145 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.469275 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.469397 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 
12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.469415 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.469510 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.470173 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.471786 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.471975 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.472246 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.472663 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.473082 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.473494 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.473770 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.474055 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.475411 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.475859 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.475976 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.476199 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.476394 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.476539 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.476643 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.476939 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.477004 4921 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.478061 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.478236 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fp9vb"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.479172 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4c92v"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.479303 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.479351 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.480597 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.481077 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gg92s"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.484165 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485512 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-image-import-ca\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485531 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-etcd-serving-ca\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485563 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-etcd-client\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485587 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj8m6\" (UniqueName: \"kubernetes.io/projected/bd5e10c8-1017-4083-a5d8-550f2aca7920-kube-api-access-qj8m6\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/345c99f7-75d2-48da-9a45-6fd8ce5c92da-images\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485624 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd5e10c8-1017-4083-a5d8-550f2aca7920-serving-cert\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485641 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/345c99f7-75d2-48da-9a45-6fd8ce5c92da-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485670 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485690 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-config\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485706 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-dir\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485722 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4kfc\" (UniqueName: \"kubernetes.io/projected/345c99f7-75d2-48da-9a45-6fd8ce5c92da-kube-api-access-x4kfc\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485759 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c99f7-75d2-48da-9a45-6fd8ce5c92da-config\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485790 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-config\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485822 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485837 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-serving-cert\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " 
pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485857 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345c9c4b-5322-4521-abdb-5736718e654c-serving-cert\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485884 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485904 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485920 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-service-ca-bundle\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485935 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98a0cc52-4219-45b7-a15f-d763979accbc-node-pullsecrets\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485982 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98a0cc52-4219-45b7-a15f-d763979accbc-audit-dir\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.485997 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-policies\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486012 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmqtf\" (UniqueName: \"kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: 
\"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486027 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-client-ca\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486056 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6srd5\" (UniqueName: \"kubernetes.io/projected/98a0cc52-4219-45b7-a15f-d763979accbc-kube-api-access-6srd5\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486072 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-audit\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486105 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486120 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486136 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-config\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486152 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-encryption-config\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486181 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pztgf\" (UID: 
\"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.486212 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6kqr\" (UniqueName: \"kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.509366 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.511266 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j5pwt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.514458 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.515364 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.526459 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xk2qg"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.545889 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.546085 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.546598 4921 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.548347 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.548528 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.548639 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.548726 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.548836 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.548915 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549128 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549204 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549271 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549432 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 13:11:37 crc 
kubenswrapper[4921]: I0312 13:11:37.549507 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549579 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549649 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.549719 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.553691 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.553956 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-22xz2"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.554097 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.554221 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gdgrq"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.554440 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.554752 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.555048 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.555311 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.560681 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.560922 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.561067 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.561179 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.561281 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.561390 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.561499 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.562048 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 
12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.562549 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.566519 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f2bw7"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.568502 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.568521 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.568645 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.568700 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.570493 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.570741 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.570921 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.571333 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.571496 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.577282 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.577485 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5ns6b"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.577738 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.578360 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.578739 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579140 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579392 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579644 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579663 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579681 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579740 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.579923 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.580515 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.580849 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gdw7"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.581177 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.581519 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.581695 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.581696 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.582077 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.582116 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.582740 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.582798 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.583210 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.583293 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.583645 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.585545 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-4z7zk"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.583826 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.585909 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.586033 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.583991 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.585308 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.586991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-audit\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587027 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587052 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-config\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587080 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-audit-policies\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587108 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587132 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587154 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-encryption-config\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587178 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587200 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-serving-cert\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587224 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-client-ca\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587247 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6kqr\" (UniqueName: \"kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587270 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-image-import-ca\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587344 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587364 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-etcd-serving-ca\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587391 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e334bcf0-dbe3-41d4-974b-222a58148c43-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8m6\" (UniqueName: \"kubernetes.io/projected/bd5e10c8-1017-4083-a5d8-550f2aca7920-kube-api-access-qj8m6\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587438 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-etcd-client\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587464 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc 
kubenswrapper[4921]: I0312 13:11:37.587488 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/345c99f7-75d2-48da-9a45-6fd8ce5c92da-images\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587514 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd5e10c8-1017-4083-a5d8-550f2aca7920-serving-cert\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/345c99f7-75d2-48da-9a45-6fd8ce5c92da-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587562 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-auth-proxy-config\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587588 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-config\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: 
\"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587612 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6757f226-348a-4d6c-a9ee-22c6315701af-metrics-tls\") pod \"dns-operator-744455d44c-fp9vb\" (UID: \"6757f226-348a-4d6c-a9ee-22c6315701af\") " pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587636 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvxf\" (UniqueName: \"kubernetes.io/projected/6757f226-348a-4d6c-a9ee-22c6315701af-kube-api-access-2gvxf\") pod \"dns-operator-744455d44c-fp9vb\" (UID: \"6757f226-348a-4d6c-a9ee-22c6315701af\") " pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587670 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-serving-cert\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.587722 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pvk6r\" (UniqueName: \"kubernetes.io/projected/e334bcf0-dbe3-41d4-974b-222a58148c43-kube-api-access-pvk6r\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.588303 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.588807 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589314 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-audit\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589373 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c82bd"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589627 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-dir\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589822 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e334bcf0-dbe3-41d4-974b-222a58148c43-serving-cert\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589859 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.589886 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c99f7-75d2-48da-9a45-6fd8ce5c92da-config\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4kfc\" (UniqueName: 
\"kubernetes.io/projected/345c99f7-75d2-48da-9a45-6fd8ce5c92da-kube-api-access-x4kfc\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590467 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-config\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590563 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345c9c4b-5322-4521-abdb-5736718e654c-serving-cert\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc 
kubenswrapper[4921]: I0312 13:11:37.590586 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c96b06-6182-4472-b8a8-393c627c77c9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tfcm2\" (UID: \"63c96b06-6182-4472-b8a8-393c627c77c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590678 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-etcd-serving-ca\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590705 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590733 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-serving-cert\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590769 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-etcd-client\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: 
\"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590792 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-encryption-config\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590852 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsb8c\" (UniqueName: \"kubernetes.io/projected/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-kube-api-access-zsb8c\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590881 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590910 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590936 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghn6b\" (UniqueName: \"kubernetes.io/projected/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-kube-api-access-ghn6b\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-config\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/675e0fd3-342d-46b4-968a-33dd611eb8c0-audit-dir\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590983 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/675e0fd3-342d-46b4-968a-33dd611eb8c0-kube-api-access-pc2wq\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591013 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-service-ca-bundle\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591059 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-config\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591087 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98a0cc52-4219-45b7-a15f-d763979accbc-node-pullsecrets\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591113 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98a0cc52-4219-45b7-a15f-d763979accbc-audit-dir\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591163 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591190 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-policies\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591216 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-config\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591242 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvnml\" (UniqueName: \"kubernetes.io/projected/63c96b06-6182-4472-b8a8-393c627c77c9-kube-api-access-vvnml\") pod \"cluster-samples-operator-665b6dd947-tfcm2\" (UID: \"63c96b06-6182-4472-b8a8-393c627c77c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591269 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqtf\" (UniqueName: \"kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591294 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqb2\" (UniqueName: \"kubernetes.io/projected/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-kube-api-access-zbqb2\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591317 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6srd5\" (UniqueName: \"kubernetes.io/projected/98a0cc52-4219-45b7-a15f-d763979accbc-kube-api-access-6srd5\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591359 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-config\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591364 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-client-ca\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591419 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-machine-approver-tls\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.591921 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.613718 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-client-ca\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.614437 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.614508 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-etcd-client\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.614826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.615325 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/345c99f7-75d2-48da-9a45-6fd8ce5c92da-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.616035 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.616889 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.617694 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.618660 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.620039 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.620510 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.620992 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.624526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd5e10c8-1017-4083-a5d8-550f2aca7920-serving-cert\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.624652 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-dir\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590180 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.590302 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.624946 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/345c99f7-75d2-48da-9a45-6fd8ce5c92da-images\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.626259 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.626905 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.627991 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-config\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.630268 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-policies\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.630392 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/98a0cc52-4219-45b7-a15f-d763979accbc-node-pullsecrets\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.630578 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/98a0cc52-4219-45b7-a15f-d763979accbc-audit-dir\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.631087 4921 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c99f7-75d2-48da-9a45-6fd8ce5c92da-config\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.631725 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-service-ca-bundle\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.631844 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345c9c4b-5322-4521-abdb-5736718e654c-serving-cert\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.631939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.636573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-serving-cert\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 
13:11:37.636644 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.637159 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dctml"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.637269 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.637335 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.647093 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.647586 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.631442 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/98a0cc52-4219-45b7-a15f-d763979accbc-encryption-config\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.650866 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652317 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652352 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hl25r"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652659 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652855 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652882 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-22xz2"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.652994 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.653123 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.653142 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r7sfx"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.653279 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd5e10c8-1017-4083-a5d8-550f2aca7920-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.654024 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/98a0cc52-4219-45b7-a15f-d763979accbc-image-import-ca\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.654101 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.654127 4921 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.657919 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.659336 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.662673 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xk2qg"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.664960 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.666126 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gdgrq"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.667058 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.667783 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pztgf"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.669633 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.669659 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.671105 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d"] Mar 12 13:11:37 crc 
kubenswrapper[4921]: I0312 13:11:37.675135 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4z7zk"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.676281 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fp9vb"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.679849 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c82bd"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.679883 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.679895 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.680216 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.681885 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sg6kz"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.682760 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.683078 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ms2ld"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.683963 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.684150 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.685080 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.686181 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gdw7"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.687463 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.688668 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.690241 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.690752 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.691871 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f2bw7"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692705 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-service-ca\") pod \"console-f9d7485db-22xz2\" (UID: 
\"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692741 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10edb77f-2de4-4817-8991-944bcdc731c3-config\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692770 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpmhm\" (UniqueName: \"kubernetes.io/projected/10edb77f-2de4-4817-8991-944bcdc731c3-kube-api-access-tpmhm\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc5516ca-a316-4768-85b7-1acc90471ad3-webhook-cert\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692900 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/294eef62-eb36-4e34-baa0-f391d25f72f1-proxy-tls\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-default-certificate\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692957 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-serving-cert\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.692986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-client-ca\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693015 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3931e667-2d91-47d8-91ef-f8df2f96d75e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693044 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-serving-cert\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:37 crc 
kubenswrapper[4921]: I0312 13:11:37.693073 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-service-ca\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693100 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49132f14-e873-424a-8ffb-ebaa836c2db5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693131 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e334bcf0-dbe3-41d4-974b-222a58148c43-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-proxy-tls\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvxf\" (UniqueName: 
\"kubernetes.io/projected/6757f226-348a-4d6c-a9ee-22c6315701af-kube-api-access-2gvxf\") pod \"dns-operator-744455d44c-fp9vb\" (UID: \"6757f226-348a-4d6c-a9ee-22c6315701af\") " pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693584 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e334bcf0-dbe3-41d4-974b-222a58148c43-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693539 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-oauth-config\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693852 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693924 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-client-ca\") 
pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693886 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvk6r\" (UniqueName: \"kubernetes.io/projected/e334bcf0-dbe3-41d4-974b-222a58148c43-kube-api-access-pvk6r\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.693984 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-trusted-ca\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e334bcf0-dbe3-41d4-974b-222a58148c43-serving-cert\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694032 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3931e667-2d91-47d8-91ef-f8df2f96d75e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 
13:11:37.694060 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10edb77f-2de4-4817-8991-944bcdc731c3-serving-cert\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694079 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gl5\" (UniqueName: \"kubernetes.io/projected/5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4-kube-api-access-55gl5\") pod \"control-plane-machine-set-operator-78cbb6b69f-x8rdl\" (UID: \"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-serving-cert\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694145 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694172 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e98fd710-c49f-489e-8ccb-04834b738a98-srv-cert\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694218 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jnx\" (UniqueName: \"kubernetes.io/projected/bc5516ca-a316-4768-85b7-1acc90471ad3-kube-api-access-r8jnx\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694239 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtgd6\" (UniqueName: \"kubernetes.io/projected/294eef62-eb36-4e34-baa0-f391d25f72f1-kube-api-access-rtgd6\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694260 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-etcd-client\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694282 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694302 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: \"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694336 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfnbn\" (UniqueName: \"kubernetes.io/projected/2221a476-fe26-4767-9773-b88c8c9bfa7e-kube-api-access-xfnbn\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694368 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e98fd710-c49f-489e-8ccb-04834b738a98-profile-collector-cert\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/675e0fd3-342d-46b4-968a-33dd611eb8c0-kube-api-access-pc2wq\") pod 
\"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694488 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shmbv\" (UniqueName: \"kubernetes.io/projected/7c43287c-9ccf-472f-b671-fb54307ee938-kube-api-access-shmbv\") pod \"package-server-manager-789f6589d5-grsp5\" (UID: \"7c43287c-9ccf-472f-b671-fb54307ee938\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694510 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hl25r"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694517 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc6791-5e3b-47e8-a542-33f6aea34ba4-config\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694701 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-service-ca-bundle\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694724 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-stats-auth\") pod \"router-default-5444994796-5ns6b\" (UID: 
\"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694749 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694770 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40fc6791-5e3b-47e8-a542-33f6aea34ba4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694792 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2221a476-fe26-4767-9773-b88c8c9bfa7e-srv-cert\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694853 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694861 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-machine-approver-tls\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694882 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wwt\" (UniqueName: \"kubernetes.io/projected/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-kube-api-access-h9wwt\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694905 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f403288d-b503-4f0c-bf83-3b29ff86ab94-secret-volume\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694927 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-config\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3931e667-2d91-47d8-91ef-f8df2f96d75e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694964 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-ca\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.694982 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcmnr\" (UniqueName: \"kubernetes.io/projected/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-kube-api-access-mcmnr\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: \"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695024 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695048 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-audit-policies\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-client\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695091 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49132f14-e873-424a-8ffb-ebaa836c2db5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695109 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-metrics-certs\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695129 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2scfr\" (UniqueName: \"kubernetes.io/projected/e98fd710-c49f-489e-8ccb-04834b738a98-kube-api-access-2scfr\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695185 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-trusted-ca-bundle\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695394 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqh67\" (UniqueName: \"kubernetes.io/projected/3931e667-2d91-47d8-91ef-f8df2f96d75e-kube-api-access-zqh67\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695451 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebdafa2f-f106-4273-be68-5f14d68904dc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695484 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49132f14-e873-424a-8ffb-ebaa836c2db5-config\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695540 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrv4q\" (UniqueName: \"kubernetes.io/projected/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-kube-api-access-hrv4q\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695568 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc5516ca-a316-4768-85b7-1acc90471ad3-tmpfs\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695596 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sg6kz"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695567 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-audit-policies\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695772 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695849 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-auth-proxy-config\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695910 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-config\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.695939 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696083 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-oauth-serving-cert\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696115 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696122 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6757f226-348a-4d6c-a9ee-22c6315701af-metrics-tls\") pod \"dns-operator-744455d44c-fp9vb\" (UID: \"6757f226-348a-4d6c-a9ee-22c6315701af\") " pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696208 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-serving-cert\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696274 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4h6c\" (UniqueName: \"kubernetes.io/projected/42780a2c-c305-4915-9be7-799cec82b8b8-kube-api-access-h4h6c\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696319 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2221a476-fe26-4767-9773-b88c8c9bfa7e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696349 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-serving-cert\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696394 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/675e0fd3-342d-46b4-968a-33dd611eb8c0-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696422 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebdafa2f-f106-4273-be68-5f14d68904dc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696576 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696606 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696631 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv5s4\" (UniqueName: \"kubernetes.io/projected/680f8033-da87-4897-bf8c-23b2ad8af659-kube-api-access-zv5s4\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696651 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-auth-proxy-config\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696671 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c43287c-9ccf-472f-b671-fb54307ee938-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-grsp5\" (UID: \"7c43287c-9ccf-472f-b671-fb54307ee938\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696749 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c96b06-6182-4472-b8a8-393c627c77c9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tfcm2\" (UID: \"63c96b06-6182-4472-b8a8-393c627c77c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696854 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42780a2c-c305-4915-9be7-799cec82b8b8-serving-cert\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-encryption-config\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696898 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsb8c\" (UniqueName: \"kubernetes.io/projected/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-kube-api-access-zsb8c\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696945 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/675e0fd3-342d-46b4-968a-33dd611eb8c0-audit-dir\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.696967 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfpw\" (UniqueName: \"kubernetes.io/projected/f403288d-b503-4f0c-bf83-3b29ff86ab94-kube-api-access-5dfpw\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697017 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-images\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697039 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghn6b\" (UniqueName: \"kubernetes.io/projected/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-kube-api-access-ghn6b\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697056 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-config\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697057 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/675e0fd3-342d-46b4-968a-33dd611eb8c0-audit-dir\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697072 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc5516ca-a316-4768-85b7-1acc90471ad3-apiservice-cert\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697140 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x8rdl\" (UID: \"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697166 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-config\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697181 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-console-config\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 
crc kubenswrapper[4921]: I0312 13:11:37.697198 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40fc6791-5e3b-47e8-a542-33f6aea34ba4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697220 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqb2\" (UniqueName: \"kubernetes.io/projected/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-kube-api-access-zbqb2\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697241 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvnml\" (UniqueName: \"kubernetes.io/projected/63c96b06-6182-4472-b8a8-393c627c77c9-kube-api-access-vvnml\") pod \"cluster-samples-operator-665b6dd947-tfcm2\" (UID: \"63c96b06-6182-4472-b8a8-393c627c77c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697258 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m76bh\" (UniqueName: \"kubernetes.io/projected/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-kube-api-access-m76bh\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697274 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2kwfb\" (UniqueName: \"kubernetes.io/projected/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-kube-api-access-2kwfb\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697292 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdafa2f-f106-4273-be68-5f14d68904dc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697309 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wc24\" (UniqueName: \"kubernetes.io/projected/10dc730d-4b35-4b62-885e-9592696ba259-kube-api-access-8wc24\") pod \"migrator-59844c95c7-nxwgt\" (UID: \"10dc730d-4b35-4b62-885e-9592696ba259\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f403288d-b503-4f0c-bf83-3b29ff86ab94-config-volume\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697338 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zft6\" (UniqueName: \"kubernetes.io/projected/57677fcb-c7a5-431c-b751-ec13d22484b1-kube-api-access-7zft6\") pod \"console-f9d7485db-22xz2\" (UID: 
\"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.697880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-config\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.698071 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.698180 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-config\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.698900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-serving-cert\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.698929 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dctml"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.699532 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e334bcf0-dbe3-41d4-974b-222a58148c43-serving-cert\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.699871 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lgwhc"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.700416 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-machine-approver-tls\") pod \"machine-approver-56656f9798-9jx6b\" (UID: \"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.700450 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.700529 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-encryption-config\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.701289 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/63c96b06-6182-4472-b8a8-393c627c77c9-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tfcm2\" (UID: \"63c96b06-6182-4472-b8a8-393c627c77c9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.701662 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: 
\"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.701732 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.701981 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/675e0fd3-342d-46b4-968a-33dd611eb8c0-etcd-client\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.702056 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6757f226-348a-4d6c-a9ee-22c6315701af-metrics-tls\") pod \"dns-operator-744455d44c-fp9vb\" (UID: \"6757f226-348a-4d6c-a9ee-22c6315701af\") " pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.702162 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-655n6"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.703107 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-655n6"] Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.703201 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-655n6" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.744791 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.764800 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.779674 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.799873 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfpw\" (UniqueName: \"kubernetes.io/projected/f403288d-b503-4f0c-bf83-3b29ff86ab94-kube-api-access-5dfpw\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.799909 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-images\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.799938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc5516ca-a316-4768-85b7-1acc90471ad3-apiservice-cert\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.799958 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x8rdl\" (UID: \"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.799979 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-console-config\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40fc6791-5e3b-47e8-a542-33f6aea34ba4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800038 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m76bh\" (UniqueName: \"kubernetes.io/projected/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-kube-api-access-m76bh\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800057 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kwfb\" (UniqueName: \"kubernetes.io/projected/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-kube-api-access-2kwfb\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800074 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdafa2f-f106-4273-be68-5f14d68904dc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800092 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wc24\" (UniqueName: \"kubernetes.io/projected/10dc730d-4b35-4b62-885e-9592696ba259-kube-api-access-8wc24\") pod \"migrator-59844c95c7-nxwgt\" (UID: \"10dc730d-4b35-4b62-885e-9592696ba259\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800112 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f403288d-b503-4f0c-bf83-3b29ff86ab94-config-volume\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800131 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zft6\" (UniqueName: \"kubernetes.io/projected/57677fcb-c7a5-431c-b751-ec13d22484b1-kube-api-access-7zft6\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800151 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-service-ca\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800174 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10edb77f-2de4-4817-8991-944bcdc731c3-config\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpmhm\" (UniqueName: \"kubernetes.io/projected/10edb77f-2de4-4817-8991-944bcdc731c3-kube-api-access-tpmhm\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800216 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc5516ca-a316-4768-85b7-1acc90471ad3-webhook-cert\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800233 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/294eef62-eb36-4e34-baa0-f391d25f72f1-proxy-tls\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800252 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-default-certificate\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800271 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3931e667-2d91-47d8-91ef-f8df2f96d75e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-serving-cert\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800307 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-service-ca\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800325 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49132f14-e873-424a-8ffb-ebaa836c2db5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800346 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-proxy-tls\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-oauth-config\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800403 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800424 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-trusted-ca\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3931e667-2d91-47d8-91ef-f8df2f96d75e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800463 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10edb77f-2de4-4817-8991-944bcdc731c3-serving-cert\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800479 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55gl5\" (UniqueName: \"kubernetes.io/projected/5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4-kube-api-access-55gl5\") pod \"control-plane-machine-set-operator-78cbb6b69f-x8rdl\" (UID: \"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800494 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-serving-cert\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800525 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e98fd710-c49f-489e-8ccb-04834b738a98-srv-cert\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800541 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jnx\" (UniqueName: \"kubernetes.io/projected/bc5516ca-a316-4768-85b7-1acc90471ad3-kube-api-access-r8jnx\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800560 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtgd6\" (UniqueName: \"kubernetes.io/projected/294eef62-eb36-4e34-baa0-f391d25f72f1-kube-api-access-rtgd6\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800578 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800596 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: \"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800612 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfnbn\" (UniqueName: \"kubernetes.io/projected/2221a476-fe26-4767-9773-b88c8c9bfa7e-kube-api-access-xfnbn\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e98fd710-c49f-489e-8ccb-04834b738a98-profile-collector-cert\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800652 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shmbv\" (UniqueName: \"kubernetes.io/projected/7c43287c-9ccf-472f-b671-fb54307ee938-kube-api-access-shmbv\") pod \"package-server-manager-789f6589d5-grsp5\" (UID: \"7c43287c-9ccf-472f-b671-fb54307ee938\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800671 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc6791-5e3b-47e8-a542-33f6aea34ba4-config\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800687 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-service-ca-bundle\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800705 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-stats-auth\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800720 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40fc6791-5e3b-47e8-a542-33f6aea34ba4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800735 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2221a476-fe26-4767-9773-b88c8c9bfa7e-srv-cert\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800728 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-console-config\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800760 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wwt\" (UniqueName: \"kubernetes.io/projected/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-kube-api-access-h9wwt\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800777 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f403288d-b503-4f0c-bf83-3b29ff86ab94-secret-volume\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800794 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-config\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3931e667-2d91-47d8-91ef-f8df2f96d75e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800841 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-ca\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800856 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcmnr\" (UniqueName: \"kubernetes.io/projected/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-kube-api-access-mcmnr\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: \"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800872 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800891 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-client\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800910 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49132f14-e873-424a-8ffb-ebaa836c2db5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-metrics-certs\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800947 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2scfr\" (UniqueName: \"kubernetes.io/projected/e98fd710-c49f-489e-8ccb-04834b738a98-kube-api-access-2scfr\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.800987 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-trusted-ca-bundle\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801012 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqh67\" (UniqueName: \"kubernetes.io/projected/3931e667-2d91-47d8-91ef-f8df2f96d75e-kube-api-access-zqh67\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801028 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebdafa2f-f106-4273-be68-5f14d68904dc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801045 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49132f14-e873-424a-8ffb-ebaa836c2db5-config\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801063 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrv4q\" (UniqueName: \"kubernetes.io/projected/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-kube-api-access-hrv4q\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801079 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc5516ca-a316-4768-85b7-1acc90471ad3-tmpfs\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801104 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-config\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801121 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801140 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-oauth-serving-cert\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4h6c\" (UniqueName: \"kubernetes.io/projected/42780a2c-c305-4915-9be7-799cec82b8b8-kube-api-access-h4h6c\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801184 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2221a476-fe26-4767-9773-b88c8c9bfa7e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801218 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebdafa2f-f106-4273-be68-5f14d68904dc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801236 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801254 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv5s4\" (UniqueName: \"kubernetes.io/projected/680f8033-da87-4897-bf8c-23b2ad8af659-kube-api-access-zv5s4\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801271 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c43287c-9ccf-472f-b671-fb54307ee938-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-grsp5\" (UID: \"7c43287c-9ccf-472f-b671-fb54307ee938\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42780a2c-c305-4915-9be7-799cec82b8b8-serving-cert\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.801730 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-service-ca\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.802786 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-oauth-serving-cert\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.802862 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-config\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.802901 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-ca\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.803060 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3931e667-2d91-47d8-91ef-f8df2f96d75e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.803312 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-service-ca\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.803380 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bc5516ca-a316-4768-85b7-1acc90471ad3-tmpfs\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.803458 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-trusted-ca-bundle\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.804053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-config\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.804244 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-trusted-ca\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.804440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.804589 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.807148 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42780a2c-c305-4915-9be7-799cec82b8b8-serving-cert\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.807766 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.807763 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42780a2c-c305-4915-9be7-799cec82b8b8-etcd-client\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.807920 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-oauth-config\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.808551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-serving-cert\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.808663 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-serving-cert\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.820340 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.843640 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.860091 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.865310 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-metrics-certs\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.880494 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.886391 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3931e667-2d91-47d8-91ef-f8df2f96d75e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.900525 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.930681 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.939987 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.945542 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-default-certificate\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.960604 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.962291 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-service-ca-bundle\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.982617 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.983447 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.983572 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 12 13:11:37 crc kubenswrapper[4921]: I0312 13:11:37.983863 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.000799 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.020637 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.025962 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-stats-auth\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.040908 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.044201 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebdafa2f-f106-4273-be68-5f14d68904dc-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.060058 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.080778 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.083478 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebdafa2f-f106-4273-be68-5f14d68904dc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.100843 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.103197 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40fc6791-5e3b-47e8-a542-33f6aea34ba4-config\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.120887 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.142403 4921 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.155603 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40fc6791-5e3b-47e8-a542-33f6aea34ba4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.160209 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.180488 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.202134 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.221033 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.227394 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/e98fd710-c49f-489e-8ccb-04834b738a98-srv-cert\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.241390 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.245648 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f403288d-b503-4f0c-bf83-3b29ff86ab94-secret-volume\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.246198 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/e98fd710-c49f-489e-8ccb-04834b738a98-profile-collector-cert\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.248460 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2221a476-fe26-4767-9773-b88c8c9bfa7e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.262135 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.284862 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.301247 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.314404 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-x8rdl\" (UID: \"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.320933 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.340895 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.360090 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.380587 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.387240 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c43287c-9ccf-472f-b671-fb54307ee938-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-grsp5\" (UID: \"7c43287c-9ccf-472f-b671-fb54307ee938\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.400300 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.427768 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 13:11:38 crc 
kubenswrapper[4921]: I0312 13:11:38.433370 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.441154 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.460142 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.462863 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f403288d-b503-4f0c-bf83-3b29ff86ab94-config-volume\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.481700 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.501197 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.521115 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.542248 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 
13:11:38.549116 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.561517 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.598932 4921 request.go:700] Waited for 1.013497924s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/secrets?fieldSelector=metadata.name%3Dingress-operator-dockercfg-7lnqk&limit=500&resourceVersion=0 Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.603211 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.605072 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.621637 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.640098 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.646136 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bc5516ca-a316-4768-85b7-1acc90471ad3-webhook-cert\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.647468 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bc5516ca-a316-4768-85b7-1acc90471ad3-apiservice-cert\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.662980 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.701526 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.708891 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.720733 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.727663 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49132f14-e873-424a-8ffb-ebaa836c2db5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 
13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.741168 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.747537 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2221a476-fe26-4767-9773-b88c8c9bfa7e-srv-cert\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.762200 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.781808 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.788178 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-proxy-tls\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.800941 4921 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.800997 4921 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.801046 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-images podName:294eef62-eb36-4e34-baa0-f391d25f72f1 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:39.301020982 +0000 UTC m=+121.991092993 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-images") pod "machine-config-operator-74547568cd-q684w" (UID: "294eef62-eb36-4e34-baa0-f391d25f72f1") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.801074 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10edb77f-2de4-4817-8991-944bcdc731c3-serving-cert podName:10edb77f-2de4-4817-8991-944bcdc731c3 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:39.301061663 +0000 UTC m=+121.991133664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/10edb77f-2de4-4817-8991-944bcdc731c3-serving-cert") pod "service-ca-operator-777779d784-nl5fb" (UID: "10edb77f-2de4-4817-8991-944bcdc731c3") : failed to sync secret cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.801422 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.802324 4921 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.802371 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/294eef62-eb36-4e34-baa0-f391d25f72f1-proxy-tls podName:294eef62-eb36-4e34-baa0-f391d25f72f1 nodeName:}" failed. 
No retries permitted until 2026-03-12 13:11:39.302355293 +0000 UTC m=+121.992427264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/294eef62-eb36-4e34-baa0-f391d25f72f1-proxy-tls") pod "machine-config-operator-74547568cd-q684w" (UID: "294eef62-eb36-4e34-baa0-f391d25f72f1") : failed to sync secret cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.802387 4921 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.802451 4921 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.802461 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-webhook-certs podName:40158ae9-4fb8-4e44-b9b1-d4abe50533d2 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:39.302439495 +0000 UTC m=+121.992511496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-webhook-certs") pod "multus-admission-controller-857f4d67dd-c82bd" (UID: "40158ae9-4fb8-4e44-b9b1-d4abe50533d2") : failed to sync secret cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.802546 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/10edb77f-2de4-4817-8991-944bcdc731c3-config podName:10edb77f-2de4-4817-8991-944bcdc731c3 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:39.302520358 +0000 UTC m=+121.992592359 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/10edb77f-2de4-4817-8991-944bcdc731c3-config") pod "service-ca-operator-777779d784-nl5fb" (UID: "10edb77f-2de4-4817-8991-944bcdc731c3") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.803301 4921 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.803373 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/49132f14-e873-424a-8ffb-ebaa836c2db5-config podName:49132f14-e873-424a-8ffb-ebaa836c2db5 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:39.303357054 +0000 UTC m=+121.993429065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/49132f14-e873-424a-8ffb-ebaa836c2db5-config") pod "kube-controller-manager-operator-78b949d7b-f9kwz" (UID: "49132f14-e873-424a-8ffb-ebaa836c2db5") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.804409 4921 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: E0312 13:11:38.804480 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-config podName:b03ac45e-da7d-4e11-ab4b-ad1032a469a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:39.304462589 +0000 UTC m=+121.994534590 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-config") pod "kube-storage-version-migrator-operator-b67b599dd-8zqbl" (UID: "b03ac45e-da7d-4e11-ab4b-ad1032a469a6") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.820483 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.840425 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.860972 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.880079 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.920445 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.941196 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.961586 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.981626 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 13:11:38 crc kubenswrapper[4921]: I0312 13:11:38.982668 4921 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.001796 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.061168 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.100911 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.120852 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.141344 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.160615 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.180846 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.220833 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.239587 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.261059 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.280955 4921 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.300971 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.319983 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49132f14-e873-424a-8ffb-ebaa836c2db5-config\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.320450 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-images\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.320562 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10edb77f-2de4-4817-8991-944bcdc731c3-config\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.320604 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/294eef62-eb36-4e34-baa0-f391d25f72f1-proxy-tls\") pod 
\"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.320673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.320714 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10edb77f-2de4-4817-8991-944bcdc731c3-serving-cert\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.320786 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: \"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.321325 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.321391 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49132f14-e873-424a-8ffb-ebaa836c2db5-config\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.321679 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.321786 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/294eef62-eb36-4e34-baa0-f391d25f72f1-images\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.321849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10edb77f-2de4-4817-8991-944bcdc731c3-config\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.324437 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: \"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.324800 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/10edb77f-2de4-4817-8991-944bcdc731c3-serving-cert\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.325039 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/294eef62-eb36-4e34-baa0-f391d25f72f1-proxy-tls\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.340635 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.360288 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.380634 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.400623 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.420696 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.440403 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.461612 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.479966 4921 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.501437 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.521032 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.618959 4921 request.go:700] Waited for 1.921693896s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.680839 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 12 13:11:39 crc kubenswrapper[4921]: E0312 13:11:39.696954 4921 projected.go:288] Couldn't get configMap openshift-authentication-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.700665 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.721724 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.741494 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.795390 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfpw\" (UniqueName: 
\"kubernetes.io/projected/f403288d-b503-4f0c-bf83-3b29ff86ab94-kube-api-access-5dfpw\") pod \"collect-profiles-29555340-b9fqc\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.828708 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/40fc6791-5e3b-47e8-a542-33f6aea34ba4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vjkr\" (UID: \"40fc6791-5e3b-47e8-a542-33f6aea34ba4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.842487 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.850335 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m76bh\" (UniqueName: \"kubernetes.io/projected/b03ac45e-da7d-4e11-ab4b-ad1032a469a6-kube-api-access-m76bh\") pod \"kube-storage-version-migrator-operator-b67b599dd-8zqbl\" (UID: \"b03ac45e-da7d-4e11-ab4b-ad1032a469a6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.860973 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kwfb\" (UniqueName: \"kubernetes.io/projected/4a5bda7f-8bc4-47fc-8eac-148ea84f0160-kube-api-access-2kwfb\") pod \"machine-config-controller-84d6567774-c2f7f\" (UID: \"4a5bda7f-8bc4-47fc-8eac-148ea84f0160\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.875343 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wc24\" 
(UniqueName: \"kubernetes.io/projected/10dc730d-4b35-4b62-885e-9592696ba259-kube-api-access-8wc24\") pod \"migrator-59844c95c7-nxwgt\" (UID: \"10dc730d-4b35-4b62-885e-9592696ba259\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.886703 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:39 crc kubenswrapper[4921]: E0312 13:11:39.924056 4921 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.929044 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.949931 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpmhm\" (UniqueName: \"kubernetes.io/projected/10edb77f-2de4-4817-8991-944bcdc731c3-kube-api-access-tpmhm\") pod \"service-ca-operator-777779d784-nl5fb\" (UID: \"10edb77f-2de4-4817-8991-944bcdc731c3\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.961236 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.967137 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.968495 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtgd6\" (UniqueName: \"kubernetes.io/projected/294eef62-eb36-4e34-baa0-f391d25f72f1-kube-api-access-rtgd6\") pod \"machine-config-operator-74547568cd-q684w\" (UID: \"294eef62-eb36-4e34-baa0-f391d25f72f1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.972299 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" Mar 12 13:11:39 crc kubenswrapper[4921]: I0312 13:11:39.982608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3931e667-2d91-47d8-91ef-f8df2f96d75e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.005978 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2scfr\" (UniqueName: \"kubernetes.io/projected/e98fd710-c49f-489e-8ccb-04834b738a98-kube-api-access-2scfr\") pod \"catalog-operator-68c6474976-fcwhd\" (UID: \"e98fd710-c49f-489e-8ccb-04834b738a98\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.022048 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcmnr\" (UniqueName: \"kubernetes.io/projected/40158ae9-4fb8-4e44-b9b1-d4abe50533d2-kube-api-access-mcmnr\") pod \"multus-admission-controller-857f4d67dd-c82bd\" (UID: 
\"40158ae9-4fb8-4e44-b9b1-d4abe50533d2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.044186 4921 projected.go:288] Couldn't get configMap openshift-authentication/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.064477 4921 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.067780 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.077905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfnbn\" (UniqueName: \"kubernetes.io/projected/2221a476-fe26-4767-9773-b88c8c9bfa7e-kube-api-access-xfnbn\") pod \"olm-operator-6b444d44fb-rxv7d\" (UID: \"2221a476-fe26-4767-9773-b88c8c9bfa7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.084608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shmbv\" (UniqueName: \"kubernetes.io/projected/7c43287c-9ccf-472f-b671-fb54307ee938-kube-api-access-shmbv\") pod \"package-server-manager-789f6589d5-grsp5\" (UID: \"7c43287c-9ccf-472f-b671-fb54307ee938\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.097450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqh67\" (UniqueName: \"kubernetes.io/projected/3931e667-2d91-47d8-91ef-f8df2f96d75e-kube-api-access-zqh67\") pod \"cluster-image-registry-operator-dc59b4c8b-x76dm\" (UID: \"3931e667-2d91-47d8-91ef-f8df2f96d75e\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.105470 4921 projected.go:288] Couldn't get configMap openshift-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.130579 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.140183 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49132f14-e873-424a-8ffb-ebaa836c2db5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f9kwz\" (UID: \"49132f14-e873-424a-8ffb-ebaa836c2db5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.148495 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.154629 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.162978 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebdafa2f-f106-4273-be68-5f14d68904dc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qvprx\" (UID: \"ebdafa2f-f106-4273-be68-5f14d68904dc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.172421 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.195785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv5s4\" (UniqueName: \"kubernetes.io/projected/680f8033-da87-4897-bf8c-23b2ad8af659-kube-api-access-zv5s4\") pod \"marketplace-operator-79b997595-7gdw7\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.216712 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jnx\" (UniqueName: \"kubernetes.io/projected/bc5516ca-a316-4768-85b7-1acc90471ad3-kube-api-access-r8jnx\") pod \"packageserver-d55dfcdfc-6pfpd\" (UID: \"bc5516ca-a316-4768-85b7-1acc90471ad3\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.217431 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.227360 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: W0312 13:11:40.233487 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03ac45e_da7d_4e11_ab4b_ad1032a469a6.slice/crio-a36723b0d1879c6693cdc0da58ec652fbcfa453dc3772b6ddbb761525f33be79 WatchSource:0}: Error finding container a36723b0d1879c6693cdc0da58ec652fbcfa453dc3772b6ddbb761525f33be79: Status 404 returned error can't find the container with id a36723b0d1879c6693cdc0da58ec652fbcfa453dc3772b6ddbb761525f33be79 Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.238151 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.246424 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.247071 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.252400 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.254905 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.263980 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.266327 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.279918 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.299277 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-q684w"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.301406 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.319942 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: W0312 13:11:40.331171 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod294eef62_eb36_4e34_baa0_f391d25f72f1.slice/crio-c1c308344a7ef908fb9737f5f050a3d6510cc14eaa42d8d93599fba70f2597f4 WatchSource:0}: Error finding container c1c308344a7ef908fb9737f5f050a3d6510cc14eaa42d8d93599fba70f2597f4: Status 404 returned error can't find the container with id c1c308344a7ef908fb9737f5f050a3d6510cc14eaa42d8d93599fba70f2597f4 Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.341190 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.360439 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.372979 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.382904 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.402911 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.432282 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.435584 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.436567 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.465532 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.470834 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.481671 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.502191 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.504029 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.520421 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: W0312 13:11:40.522359 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c43287c_9ccf_472f_b671_fb54307ee938.slice/crio-56688c4e25d1b70a3ce8546078687d2425f9bce8729aec698f985a215dba7d8b WatchSource:0}: Error finding container 56688c4e25d1b70a3ce8546078687d2425f9bce8729aec698f985a215dba7d8b: Status 404 returned error can't find the container with id 56688c4e25d1b70a3ce8546078687d2425f9bce8729aec698f985a215dba7d8b Mar 12 13:11:40 crc kubenswrapper[4921]: 
I0312 13:11:40.543424 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.561571 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.582583 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.602954 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.607799 4921 projected.go:194] Error preparing data for projected volume kube-api-access-qj8m6 for pod openshift-authentication-operator/authentication-operator-69f744f599-4c92v: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.607886 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd5e10c8-1017-4083-a5d8-550f2aca7920-kube-api-access-qj8m6 podName:bd5e10c8-1017-4083-a5d8-550f2aca7920 nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.107862999 +0000 UTC m=+123.797934970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qj8m6" (UniqueName: "kubernetes.io/projected/bd5e10c8-1017-4083-a5d8-550f2aca7920-kube-api-access-qj8m6") pod "authentication-operator-69f744f599-4c92v" (UID: "bd5e10c8-1017-4083-a5d8-550f2aca7920") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.634099 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.640622 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.645216 4921 projected.go:194] Error preparing data for projected volume kube-api-access-x4kfc for pod openshift-machine-api/machine-api-operator-5694c8668f-r7sfx: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.645305 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/345c99f7-75d2-48da-9a45-6fd8ce5c92da-kube-api-access-x4kfc podName:345c99f7-75d2-48da-9a45-6fd8ce5c92da nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.145280722 +0000 UTC m=+123.835352693 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x4kfc" (UniqueName: "kubernetes.io/projected/345c99f7-75d2-48da-9a45-6fd8ce5c92da-kube-api-access-x4kfc") pod "machine-api-operator-5694c8668f-r7sfx" (UID: "345c99f7-75d2-48da-9a45-6fd8ce5c92da") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.651679 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gl5\" (UniqueName: \"kubernetes.io/projected/5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4-kube-api-access-55gl5\") pod \"control-plane-machine-set-operator-78cbb6b69f-x8rdl\" (UID: \"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.661462 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.679847 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.683391 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" event={"ID":"f403288d-b503-4f0c-bf83-3b29ff86ab94","Type":"ContainerStarted","Data":"fc91c9434028740655300c434b681676ae5f6bc96b088a1a58fa416bc98b3208"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.685919 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" event={"ID":"f403288d-b503-4f0c-bf83-3b29ff86ab94","Type":"ContainerStarted","Data":"7a3e9667630e8d4701542a5a65087e6697285fb7735bf6b95e5484279fac4394"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.687530 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.691544 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" event={"ID":"4a5bda7f-8bc4-47fc-8eac-148ea84f0160","Type":"ContainerStarted","Data":"c4515004be190e784f3866e69f3adacf262486629acdea8018745dddd13708fb"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.691581 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" event={"ID":"4a5bda7f-8bc4-47fc-8eac-148ea84f0160","Type":"ContainerStarted","Data":"af66923e799c9a03324652711ee16fb6ce8f6c4173e4aee40174a4d99ca15f1a"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.691596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" event={"ID":"4a5bda7f-8bc4-47fc-8eac-148ea84f0160","Type":"ContainerStarted","Data":"1bfa19efaa5fefdb87b9a0f34f58cdfb8d8237d3a94098751e1c861b367f0e44"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.699728 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.700910 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" event={"ID":"10dc730d-4b35-4b62-885e-9592696ba259","Type":"ContainerStarted","Data":"7f9018f35504552f9297ee7268e186c809fbd25b57dcf34c9bc9e83a3614655e"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.702429 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" 
event={"ID":"b03ac45e-da7d-4e11-ab4b-ad1032a469a6","Type":"ContainerStarted","Data":"653bc749208c07c4f39dceea10238af87ee1e3b962057b68cbf0b9bb78b41e61"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.702470 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" event={"ID":"b03ac45e-da7d-4e11-ab4b-ad1032a469a6","Type":"ContainerStarted","Data":"a36723b0d1879c6693cdc0da58ec652fbcfa453dc3772b6ddbb761525f33be79"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.704919 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" event={"ID":"e98fd710-c49f-489e-8ccb-04834b738a98","Type":"ContainerStarted","Data":"94e4693e2f77777d09a3b996b78fedeed7551c4ef1656a84fb6cb0167b655eb6"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.704969 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" event={"ID":"e98fd710-c49f-489e-8ccb-04834b738a98","Type":"ContainerStarted","Data":"2a65dcf4b7f1c954d41e1033a78ee0c62b30326ef8dbcb2d4133395e6c6936a7"} Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.706007 4921 projected.go:194] Error preparing data for projected volume kube-api-access-n6kqr for pod openshift-authentication/oauth-openshift-558db77b4-gg92s: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.706073 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr podName:88e0b0eb-d051-410d-b2e8-c80e9fe3fdce nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.20605589 +0000 UTC m=+123.896127861 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n6kqr" (UniqueName: "kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr") pod "oauth-openshift-558db77b4-gg92s" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.706098 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.709463 4921 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fcwhd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.709523 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" podUID="e98fd710-c49f-489e-8ccb-04834b738a98" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.709668 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" event={"ID":"3931e667-2d91-47d8-91ef-f8df2f96d75e","Type":"ContainerStarted","Data":"b6e97daf39107b97d3fba8aee01154d220a7e93969dc15cb2fa05b2dc9b7b1e2"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.717625 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" event={"ID":"10edb77f-2de4-4817-8991-944bcdc731c3","Type":"ContainerStarted","Data":"cc59003856be0692754342b326e1c5670d1767d0e2579b61774beb4ebf2e5e10"} Mar 12 
13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.717682 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" event={"ID":"10edb77f-2de4-4817-8991-944bcdc731c3","Type":"ContainerStarted","Data":"9b680371fc02b4a757d3b5e0834ec66ca6ca2505a22b6970c502d7c3f69335a1"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.719645 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.724194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" event={"ID":"294eef62-eb36-4e34-baa0-f391d25f72f1","Type":"ContainerStarted","Data":"61a83fa4e07d45df3c331f09edd4ba52897e83bacbb946f1b4cad8093c6fa7d3"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.724242 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" event={"ID":"294eef62-eb36-4e34-baa0-f391d25f72f1","Type":"ContainerStarted","Data":"91892e6877b890c21a0db7545e11922cb0cf898161c91df539dfaed6dce757e7"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.724254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" event={"ID":"294eef62-eb36-4e34-baa0-f391d25f72f1","Type":"ContainerStarted","Data":"c1c308344a7ef908fb9737f5f050a3d6510cc14eaa42d8d93599fba70f2597f4"} Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.724556 4921 projected.go:194] Error preparing data for projected volume kube-api-access-qmqtf for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.724621 4921 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf podName:345c9c4b-5322-4521-abdb-5736718e654c nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.224600106 +0000 UTC m=+123.914672077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qmqtf" (UniqueName: "kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf") pod "route-controller-manager-6576b87f9c-zdw4r" (UID: "345c9c4b-5322-4521-abdb-5736718e654c") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.737548 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" event={"ID":"7c43287c-9ccf-472f-b671-fb54307ee938","Type":"ContainerStarted","Data":"575b9ada49ad6377b7147174b7b56dfaff391e67471d288d5dba970d9c31c20e"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.737606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" event={"ID":"7c43287c-9ccf-472f-b671-fb54307ee938","Type":"ContainerStarted","Data":"56688c4e25d1b70a3ce8546078687d2425f9bce8729aec698f985a215dba7d8b"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.741367 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.744568 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" event={"ID":"40fc6791-5e3b-47e8-a542-33f6aea34ba4","Type":"ContainerStarted","Data":"e8c637c823a48a190b915f8226bdb87a7eb7ab4e75488818a7e28d184348a178"} Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.744619 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" event={"ID":"40fc6791-5e3b-47e8-a542-33f6aea34ba4","Type":"ContainerStarted","Data":"46664de400ec2ed9fe9c5e7a7e346ec375f5365182f149506c61eecc885ce666"} Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.746269 4921 projected.go:194] Error preparing data for projected volume kube-api-access-6srd5 for pod openshift-apiserver/apiserver-76f77b778f-pztgf: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: E0312 13:11:40.746343 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/98a0cc52-4219-45b7-a15f-d763979accbc-kube-api-access-6srd5 podName:98a0cc52-4219-45b7-a15f-d763979accbc nodeName:}" failed. No retries permitted until 2026-03-12 13:11:41.246324561 +0000 UTC m=+123.936396532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6srd5" (UniqueName: "kubernetes.io/projected/98a0cc52-4219-45b7-a15f-d763979accbc-kube-api-access-6srd5") pod "apiserver-76f77b778f-pztgf" (UID: "98a0cc52-4219-45b7-a15f-d763979accbc") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.761509 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.775169 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrv4q\" (UniqueName: \"kubernetes.io/projected/0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c-kube-api-access-hrv4q\") pod \"router-default-5444994796-5ns6b\" (UID: \"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c\") " pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.781045 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 13:11:40 crc 
kubenswrapper[4921]: I0312 13:11:40.794317 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.801288 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.802802 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-c82bd"] Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.809977 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvxf\" (UniqueName: \"kubernetes.io/projected/6757f226-348a-4d6c-a9ee-22c6315701af-kube-api-access-2gvxf\") pod \"dns-operator-744455d44c-fp9vb\" (UID: \"6757f226-348a-4d6c-a9ee-22c6315701af\") " pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.817120 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d"] Mar 12 13:11:40 crc kubenswrapper[4921]: W0312 13:11:40.818502 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49132f14_e873_424a_8ffb_ebaa836c2db5.slice/crio-1867d8d6ceadf6770480e5c4849097d5d7908967bbbc480177cfa6f55620adcc WatchSource:0}: Error finding container 1867d8d6ceadf6770480e5c4849097d5d7908967bbbc480177cfa6f55620adcc: Status 404 returned error can't find the container with id 1867d8d6ceadf6770480e5c4849097d5d7908967bbbc480177cfa6f55620adcc Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.821459 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: W0312 13:11:40.828763 4921 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2221a476_fe26_4767_9773_b88c8c9bfa7e.slice/crio-7c1b039a3e757204065615a9433794835a6cb3e56d4f9a0932165f2922125598 WatchSource:0}: Error finding container 7c1b039a3e757204065615a9433794835a6cb3e56d4f9a0932165f2922125598: Status 404 returned error can't find the container with id 7c1b039a3e757204065615a9433794835a6cb3e56d4f9a0932165f2922125598 Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.831016 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvk6r\" (UniqueName: \"kubernetes.io/projected/e334bcf0-dbe3-41d4-974b-222a58148c43-kube-api-access-pvk6r\") pod \"openshift-config-operator-7777fb866f-r6qq6\" (UID: \"e334bcf0-dbe3-41d4-974b-222a58148c43\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.840897 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.846376 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc2wq\" (UniqueName: \"kubernetes.io/projected/675e0fd3-342d-46b4-968a-33dd611eb8c0-kube-api-access-pc2wq\") pod \"apiserver-7bbb656c7d-96jwt\" (UID: \"675e0fd3-342d-46b4-968a-33dd611eb8c0\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.860316 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.867630 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsb8c\" (UniqueName: \"kubernetes.io/projected/a55aa2b4-ed3e-414c-8fa3-cba24092f81a-kube-api-access-zsb8c\") pod \"machine-approver-56656f9798-9jx6b\" (UID: 
\"a55aa2b4-ed3e-414c-8fa3-cba24092f81a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.881108 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.888783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghn6b\" (UniqueName: \"kubernetes.io/projected/e0e82943-c5ab-4f7e-91d2-f99937a1ad40-kube-api-access-ghn6b\") pod \"openshift-apiserver-operator-796bbdcf4f-9lvzt\" (UID: \"e0e82943-c5ab-4f7e-91d2-f99937a1ad40\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.900063 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.910060 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqb2\" (UniqueName: \"kubernetes.io/projected/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-kube-api-access-zbqb2\") pod \"controller-manager-879f6c89f-j5pwt\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") " pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.922763 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.929166 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvnml\" (UniqueName: \"kubernetes.io/projected/63c96b06-6182-4472-b8a8-393c627c77c9-kube-api-access-vvnml\") pod \"cluster-samples-operator-665b6dd947-tfcm2\" (UID: \"63c96b06-6182-4472-b8a8-393c627c77c9\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.960554 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.966366 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zft6\" (UniqueName: \"kubernetes.io/projected/57677fcb-c7a5-431c-b751-ec13d22484b1-kube-api-access-7zft6\") pod \"console-f9d7485db-22xz2\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.980704 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 13:11:40 crc kubenswrapper[4921]: I0312 13:11:40.987018 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wwt\" (UniqueName: \"kubernetes.io/projected/ec32db1f-c08c-4ea3-93c0-13dee21a1deb-kube-api-access-h9wwt\") pod \"console-operator-58897d9998-gdgrq\" (UID: \"ec32db1f-c08c-4ea3-93c0-13dee21a1deb\") " pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.002507 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.008975 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4h6c\" (UniqueName: \"kubernetes.io/projected/42780a2c-c305-4915-9be7-799cec82b8b8-kube-api-access-h4h6c\") pod \"etcd-operator-b45778765-xk2qg\" (UID: \"42780a2c-c305-4915-9be7-799cec82b8b8\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.761374 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" event={"ID":"49132f14-e873-424a-8ffb-ebaa836c2db5","Type":"ContainerStarted","Data":"b1de8b62f44fdb8b1223c7c9df6b7ff49ff96b0863afa57a3052082463a969c7"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.761450 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" event={"ID":"49132f14-e873-424a-8ffb-ebaa836c2db5","Type":"ContainerStarted","Data":"1867d8d6ceadf6770480e5c4849097d5d7908967bbbc480177cfa6f55620adcc"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.763470 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" event={"ID":"10dc730d-4b35-4b62-885e-9592696ba259","Type":"ContainerStarted","Data":"35665332c466d1b28649086da167c825151d866d0740cf129c6bf41f2984cf45"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.763503 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" event={"ID":"10dc730d-4b35-4b62-885e-9592696ba259","Type":"ContainerStarted","Data":"4f56e5067bfd158c993151496e696d411c2f850034e6a0c95f3cdeae61aeb26d"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.779003 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" event={"ID":"ebdafa2f-f106-4273-be68-5f14d68904dc","Type":"ContainerStarted","Data":"bd5880eaa9ddf71797d42c904293e3778cbded4d72235ed39c53b4a3cf4074a4"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.779052 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" 
event={"ID":"ebdafa2f-f106-4273-be68-5f14d68904dc","Type":"ContainerStarted","Data":"738b073779048ec13e0c5678624c06b86ed78dc24b10c897e874440c75b68da8"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.780861 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" event={"ID":"3931e667-2d91-47d8-91ef-f8df2f96d75e","Type":"ContainerStarted","Data":"d289fa265bca996cc6f6fac0164ebdadabf6f2db1ffa80ee793ca2e568edc19e"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.783696 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" event={"ID":"7c43287c-9ccf-472f-b671-fb54307ee938","Type":"ContainerStarted","Data":"10a7d047c698db473e406bba7f19706da9328401e53da7f4caee2e75fb60396c"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.784023 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.785746 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" event={"ID":"40158ae9-4fb8-4e44-b9b1-d4abe50533d2","Type":"ContainerStarted","Data":"16107ddf80b574c0e9e9129fc4ec3db48929c887902f0e4bb71b956bd5aedd4b"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.785786 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" event={"ID":"40158ae9-4fb8-4e44-b9b1-d4abe50533d2","Type":"ContainerStarted","Data":"7a5764a6d4cc2716887d891bc75223f0eadf656847e84905f290f82de06d6ded"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.785798 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" 
event={"ID":"40158ae9-4fb8-4e44-b9b1-d4abe50533d2","Type":"ContainerStarted","Data":"0141ae45cff9751e6b94051bc314a83d6b3ada8a220b9e1d20ee9bd1349bb56f"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.787342 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" event={"ID":"2221a476-fe26-4767-9773-b88c8c9bfa7e","Type":"ContainerStarted","Data":"6b3cf66c8e66bba1d19f733e388bc897e9f151ddffc55a80626bbe56e89d7dfd"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.787445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" event={"ID":"2221a476-fe26-4767-9773-b88c8c9bfa7e","Type":"ContainerStarted","Data":"7c1b039a3e757204065615a9433794835a6cb3e56d4f9a0932165f2922125598"} Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.787528 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.789603 4921 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rxv7d container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.789649 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" podUID="2221a476-fe26-4767-9773-b88c8c9bfa7e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 12 13:11:41 crc kubenswrapper[4921]: I0312 13:11:41.913376 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-nl5fb" podStartSLOduration=58.913344393 podStartE2EDuration="58.913344393s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:41.911330731 +0000 UTC m=+124.601402702" watchObservedRunningTime="2026-03-12 13:11:41.913344393 +0000 UTC m=+124.603416364" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.029589 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vjkr" podStartSLOduration=59.029570564 podStartE2EDuration="59.029570564s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.028286134 +0000 UTC m=+124.718358115" watchObservedRunningTime="2026-03-12 13:11:42.029570564 +0000 UTC m=+124.719642535" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.115284 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-q684w" podStartSLOduration=59.115269456 podStartE2EDuration="59.115269456s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.114017317 +0000 UTC m=+124.804089298" watchObservedRunningTime="2026-03-12 13:11:42.115269456 +0000 UTC m=+124.805341427" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.192521 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" podStartSLOduration=59.192501555 podStartE2EDuration="59.192501555s" podCreationTimestamp="2026-03-12 13:10:43 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.190911516 +0000 UTC m=+124.880983507" watchObservedRunningTime="2026-03-12 13:11:42.192501555 +0000 UTC m=+124.882573526" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.228778 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" podStartSLOduration=60.228760832 podStartE2EDuration="1m0.228760832s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.2264479 +0000 UTC m=+124.916519881" watchObservedRunningTime="2026-03-12 13:11:42.228760832 +0000 UTC m=+124.918832803" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.478146 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8zqbl" podStartSLOduration=59.478127158 podStartE2EDuration="59.478127158s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.475834017 +0000 UTC m=+125.165905988" watchObservedRunningTime="2026-03-12 13:11:42.478127158 +0000 UTC m=+125.168199129" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.508222 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-c2f7f" podStartSLOduration=59.508204313 podStartE2EDuration="59.508204313s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
13:11:42.508203243 +0000 UTC m=+125.198275214" watchObservedRunningTime="2026-03-12 13:11:42.508204313 +0000 UTC m=+125.198276284" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.649783 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nxwgt" podStartSLOduration=59.64976609 podStartE2EDuration="59.64976609s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.647598483 +0000 UTC m=+125.337670454" watchObservedRunningTime="2026-03-12 13:11:42.64976609 +0000 UTC m=+125.339838061" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.689953 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" podStartSLOduration=59.689920757 podStartE2EDuration="59.689920757s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.689329099 +0000 UTC m=+125.379401070" watchObservedRunningTime="2026-03-12 13:11:42.689920757 +0000 UTC m=+125.379992728" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.730220 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f9kwz" podStartSLOduration=59.730178397 podStartE2EDuration="59.730178397s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.72960193 +0000 UTC m=+125.419673901" watchObservedRunningTime="2026-03-12 13:11:42.730178397 +0000 UTC m=+125.420250368" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 
13:11:42.776395 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-c82bd" podStartSLOduration=59.776357652 podStartE2EDuration="59.776357652s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.773806193 +0000 UTC m=+125.463878164" watchObservedRunningTime="2026-03-12 13:11:42.776357652 +0000 UTC m=+125.466429623" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.813390 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qvprx" podStartSLOduration=59.813347712 podStartE2EDuration="59.813347712s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.812289058 +0000 UTC m=+125.502361029" watchObservedRunningTime="2026-03-12 13:11:42.813347712 +0000 UTC m=+125.503419683" Mar 12 13:11:42 crc kubenswrapper[4921]: I0312 13:11:42.853215 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x76dm" podStartSLOduration=59.853194899 podStartE2EDuration="59.853194899s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.849967059 +0000 UTC m=+125.540039040" watchObservedRunningTime="2026-03-12 13:11:42.853194899 +0000 UTC m=+125.543266870" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.118672 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" 
podStartSLOduration=60.118644976 podStartE2EDuration="1m0.118644976s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:42.897864877 +0000 UTC m=+125.587936848" watchObservedRunningTime="2026-03-12 13:11:43.118644976 +0000 UTC m=+125.808716987" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.122688 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lvkz9"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.124212 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.126503 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.133014 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvkz9"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.327224 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ll7bv"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.329127 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.331678 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.337921 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ll7bv"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.520989 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kv2xc"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.522169 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.542873 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kv2xc"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.724368 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7vlvg"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.726525 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.748737 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7vlvg"] Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.798927 4921 generic.go:334] "Generic (PLEG): container finished" podID="f403288d-b503-4f0c-bf83-3b29ff86ab94" containerID="fc91c9434028740655300c434b681676ae5f6bc96b088a1a58fa416bc98b3208" exitCode=0 Mar 12 13:11:43 crc kubenswrapper[4921]: I0312 13:11:43.799056 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" event={"ID":"f403288d-b503-4f0c-bf83-3b29ff86ab94","Type":"ContainerDied","Data":"fc91c9434028740655300c434b681676ae5f6bc96b088a1a58fa416bc98b3208"} Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.125361 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6tzw9"] Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.127342 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.129634 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.138002 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tzw9"] Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.527848 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tfjjf"] Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.530165 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:45 crc kubenswrapper[4921]: I0312 13:11:45.550998 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfjjf"] Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.310957 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.325679 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m6c4k"] Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.332148 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.337258 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.370987 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6c4k"] Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.721161 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xxd4x"] Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.722119 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.738667 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxd4x"] Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.952065 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.953209 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.960709 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.961359 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 13:11:46 crc kubenswrapper[4921]: I0312 13:11:46.970504 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.119921 4921 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.144218 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.144303 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.145173 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.145600 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.145952 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.146011 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:12:03.145986641 +0000 UTC m=+145.836058622 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.146051 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29a3ac39-3f54-47f8-947e-c5d5f4709c23-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.146091 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29a3ac39-3f54-47f8-947e-c5d5f4709c23-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.146452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-tls\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.146511 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.146564 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.146586 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147129 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147176 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-bound-sa-token\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147209 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4kfc\" (UniqueName: \"kubernetes.io/projected/345c99f7-75d2-48da-9a45-6fd8ce5c92da-kube-api-access-x4kfc\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147239 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147270 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147353 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6srd5\" (UniqueName: \"kubernetes.io/projected/98a0cc52-4219-45b7-a15f-d763979accbc-kube-api-access-6srd5\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147399 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-trusted-ca\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6kqr\" (UniqueName: \"kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147360 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.147474 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:47.647460898 +0000 UTC m=+130.337532879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147861 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj8m6\" (UniqueName: \"kubernetes.io/projected/bd5e10c8-1017-4083-a5d8-550f2aca7920-kube-api-access-qj8m6\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147930 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.147972 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.148004 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-certificates\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.148035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt5fc\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-kube-api-access-nt5fc\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.148065 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmqtf\" (UniqueName: \"kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.149648 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.152039 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.152107 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.152157 4921 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.152257 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.154769 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.155045 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.155125 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.155259 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.155499 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.155538 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.156366 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.156464 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.156848 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.156914 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.157197 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.157299 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.157594 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.157601 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.158545 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.159999 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmqtf\" (UniqueName: \"kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf\") pod \"route-controller-manager-6576b87f9c-zdw4r\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.160323 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.161396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4kfc\" (UniqueName: \"kubernetes.io/projected/345c99f7-75d2-48da-9a45-6fd8ce5c92da-kube-api-access-x4kfc\") pod \"machine-api-operator-5694c8668f-r7sfx\" (UID: \"345c99f7-75d2-48da-9a45-6fd8ce5c92da\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.164469 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6srd5\" 
(UniqueName: \"kubernetes.io/projected/98a0cc52-4219-45b7-a15f-d763979accbc-kube-api-access-6srd5\") pod \"apiserver-76f77b778f-pztgf\" (UID: \"98a0cc52-4219-45b7-a15f-d763979accbc\") " pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.165321 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj8m6\" (UniqueName: \"kubernetes.io/projected/bd5e10c8-1017-4083-a5d8-550f2aca7920-kube-api-access-qj8m6\") pod \"authentication-operator-69f744f599-4c92v\" (UID: \"bd5e10c8-1017-4083-a5d8-550f2aca7920\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.166761 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6kqr\" (UniqueName: \"kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr\") pod \"oauth-openshift-558db77b4-gg92s\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.173905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9823f1cf-662f-4896-a6a0-a3bfb3aa660b-metrics-certs\") pod \"network-metrics-daemon-5jsfz\" (UID: \"9823f1cf-662f-4896-a6a0-a3bfb3aa660b\") " pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.220096 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.222419 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.224798 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.225030 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.234421 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.249392 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.249591 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-bound-sa-token\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.251196 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:47.74958591 +0000 UTC m=+130.439657881 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.251270 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.251358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-utilities\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.251391 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-config-volume\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.251444 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ksns\" (UniqueName: 
\"kubernetes.io/projected/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-kube-api-access-7ksns\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.252643 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktws\" (UniqueName: \"kubernetes.io/projected/81ec102d-42ba-4d41-952d-d36fa110e626-kube-api-access-nktws\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.252743 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-metrics-tls\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.252872 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-trusted-ca\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.252953 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tm68\" (UniqueName: \"kubernetes.io/projected/50255da2-a710-48bc-8a00-36146dec247a-kube-api-access-4tm68\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc 
kubenswrapper[4921]: I0312 13:11:47.252990 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9lfs\" (UniqueName: \"kubernetes.io/projected/a7c45059-acf8-4cb3-b1f6-f07128d72141-kube-api-access-p9lfs\") pod \"downloads-7954f5f757-4z7zk\" (UID: \"a7c45059-acf8-4cb3-b1f6-f07128d72141\") " pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253058 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbrp\" (UniqueName: \"kubernetes.io/projected/dc904419-43b3-4164-8efb-b493171791cc-kube-api-access-dsbrp\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50255da2-a710-48bc-8a00-36146dec247a-metrics-tls\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253116 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/50255da2-a710-48bc-8a00-36146dec247a-trusted-ca\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253147 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50255da2-a710-48bc-8a00-36146dec247a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253165 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253202 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-catalog-content\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253261 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-catalog-content\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253350 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81ec102d-42ba-4d41-952d-d36fa110e626-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253477 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-certificates\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253561 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt5fc\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-kube-api-access-nt5fc\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253602 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-utilities\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253659 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/2b934596-5580-41ba-8ad2-8722f4cf476d-kube-api-access-f8nv5\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " 
pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253751 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29a3ac39-3f54-47f8-947e-c5d5f4709c23-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253784 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66c4t\" (UniqueName: \"kubernetes.io/projected/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-kube-api-access-66c4t\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.253833 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29a3ac39-3f54-47f8-947e-c5d5f4709c23-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.263284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-trusted-ca\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.264190 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/81ec102d-42ba-4d41-952d-d36fa110e626-ready\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: 
\"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.264233 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj5hx\" (UniqueName: \"kubernetes.io/projected/8766c23e-233b-4eab-9d32-793e70fa9284-kube-api-access-hj5hx\") pod \"ingress-canary-sg6kz\" (UID: \"8766c23e-233b-4eab-9d32-793e70fa9284\") " pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.264278 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.264318 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8766c23e-233b-4eab-9d32-793e70fa9284-cert\") pod \"ingress-canary-sg6kz\" (UID: \"8766c23e-233b-4eab-9d32-793e70fa9284\") " pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.264401 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/81ec102d-42ba-4d41-952d-d36fa110e626-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.264456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-tls\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.272024 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-certificates\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.274363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29a3ac39-3f54-47f8-947e-c5d5f4709c23-ca-trust-extracted\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.279769 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.282090 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-bound-sa-token\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.284152 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.294941 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.295149 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt5fc\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-kube-api-access-nt5fc\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.295375 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29a3ac39-3f54-47f8-947e-c5d5f4709c23-installation-pull-secrets\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.298888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-tls\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.303938 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.311256 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.340252 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.344557 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365195 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365231 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365262 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-utilities\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-config-volume\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-catalog-content\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365332 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ksns\" (UniqueName: \"kubernetes.io/projected/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-kube-api-access-7ksns\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365351 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktws\" (UniqueName: \"kubernetes.io/projected/81ec102d-42ba-4d41-952d-d36fa110e626-kube-api-access-nktws\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365369 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f8a5699-e034-4255-b82a-d58becd6a2aa-signing-key\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365386 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-catalog-content\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-metrics-tls\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365423 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f8a5699-e034-4255-b82a-d58becd6a2aa-signing-cabundle\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365443 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc884fdf-9890-4cc6-b0cf-9028a290209b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365465 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tm68\" (UniqueName: \"kubernetes.io/projected/50255da2-a710-48bc-8a00-36146dec247a-kube-api-access-4tm68\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc 
kubenswrapper[4921]: I0312 13:11:47.365484 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365503 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9lfs\" (UniqueName: \"kubernetes.io/projected/a7c45059-acf8-4cb3-b1f6-f07128d72141-kube-api-access-p9lfs\") pod \"downloads-7954f5f757-4z7zk\" (UID: \"a7c45059-acf8-4cb3-b1f6-f07128d72141\") " pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365519 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/21d99397-dae5-442c-b0e7-bfb634866216-certs\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365537 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-catalog-content\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365552 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbrp\" (UniqueName: \"kubernetes.io/projected/dc904419-43b3-4164-8efb-b493171791cc-kube-api-access-dsbrp\") pod \"community-operators-7vlvg\" (UID: 
\"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365577 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-utilities\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365606 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50255da2-a710-48bc-8a00-36146dec247a-metrics-tls\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365623 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50255da2-a710-48bc-8a00-36146dec247a-trusted-ca\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365638 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50255da2-a710-48bc-8a00-36146dec247a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365661 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365677 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rnbt\" (UniqueName: \"kubernetes.io/projected/82dff338-35e1-44df-8f20-a4d4d8b3c198-kube-api-access-2rnbt\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365706 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-catalog-content\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365730 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-catalog-content\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365747 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99ms7\" (UniqueName: \"kubernetes.io/projected/9a31a895-ced3-4285-8105-448501c3ceac-kube-api-access-99ms7\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365766 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/81ec102d-42ba-4d41-952d-d36fa110e626-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365782 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-catalog-content\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.365851 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:47.86578738 +0000 UTC m=+130.555859521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365931 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc884fdf-9890-4cc6-b0cf-9028a290209b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.365979 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-q6484\" (UniqueName: \"kubernetes.io/projected/8f8a5699-e034-4255-b82a-d58becd6a2aa-kube-api-access-q6484\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366023 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlcrc\" (UniqueName: \"kubernetes.io/projected/21d99397-dae5-442c-b0e7-bfb634866216-kube-api-access-xlcrc\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366054 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-catalog-content\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366089 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-utilities\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-utilities\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366199 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-plugins-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366226 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/2b934596-5580-41ba-8ad2-8722f4cf476d-kube-api-access-f8nv5\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366252 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-socket-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-mountpoint-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366331 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66c4t\" (UniqueName: \"kubernetes.io/projected/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-kube-api-access-66c4t\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 
13:11:47.366355 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-utilities\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366379 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-utilities\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366419 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqxbg\" (UniqueName: \"kubernetes.io/projected/d6868925-795c-4765-9343-0b147db98216-kube-api-access-xqxbg\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366489 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj5hx\" (UniqueName: \"kubernetes.io/projected/8766c23e-233b-4eab-9d32-793e70fa9284-kube-api-access-hj5hx\") pod \"ingress-canary-sg6kz\" (UID: \"8766c23e-233b-4eab-9d32-793e70fa9284\") " pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/81ec102d-42ba-4d41-952d-d36fa110e626-ready\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc 
kubenswrapper[4921]: I0312 13:11:47.366548 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-utilities\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366572 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dcb2\" (UniqueName: \"kubernetes.io/projected/0a8433ae-09da-4dfb-98c6-922fcfbaa546-kube-api-access-4dcb2\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366603 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8766c23e-233b-4eab-9d32-793e70fa9284-cert\") pod \"ingress-canary-sg6kz\" (UID: \"8766c23e-233b-4eab-9d32-793e70fa9284\") " pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366628 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366644 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-config-volume\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366652 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8tt\" (UniqueName: \"kubernetes.io/projected/ec0983c2-4cd5-41aa-972c-60dd47817a5b-kube-api-access-7f8tt\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366681 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-csi-data-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gmld\" (UniqueName: \"kubernetes.io/projected/0840f674-6e13-4336-ad20-a67b979ae5ba-kube-api-access-5gmld\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzpbh\" (UniqueName: \"kubernetes.io/projected/7fef7638-98df-405a-b04b-f47997b46eac-kube-api-access-gzpbh\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366792 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/21d99397-dae5-442c-b0e7-bfb634866216-node-bootstrap-token\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " 
pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366853 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/81ec102d-42ba-4d41-952d-d36fa110e626-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366888 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-registration-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.367615 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/81ec102d-42ba-4d41-952d-d36fa110e626-ready\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.367651 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-utilities\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.366160 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-utilities\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " 
pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.368608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81ec102d-42ba-4d41-952d-d36fa110e626-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.368631 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.368840 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-catalog-content\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.368992 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-catalog-content\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.369319 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/81ec102d-42ba-4d41-952d-d36fa110e626-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.370313 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50255da2-a710-48bc-8a00-36146dec247a-trusted-ca\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.372476 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8766c23e-233b-4eab-9d32-793e70fa9284-cert\") pod \"ingress-canary-sg6kz\" (UID: \"8766c23e-233b-4eab-9d32-793e70fa9284\") " pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.373680 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/50255da2-a710-48bc-8a00-36146dec247a-metrics-tls\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.374118 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-metrics-tls\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.376180 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-utilities\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc 
kubenswrapper[4921]: I0312 13:11:47.376257 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-catalog-content\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.377608 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.385188 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.385574 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.386652 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.386993 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.389917 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.395846 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.397287 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktws\" (UniqueName: \"kubernetes.io/projected/81ec102d-42ba-4d41-952d-d36fa110e626-kube-api-access-nktws\") pod \"cni-sysctl-allowlist-ds-lgwhc\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.398321 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.399298 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.400740 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.402178 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.405055 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jsfz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.408063 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tm68\" (UniqueName: \"kubernetes.io/projected/50255da2-a710-48bc-8a00-36146dec247a-kube-api-access-4tm68\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.408544 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9lfs\" (UniqueName: \"kubernetes.io/projected/a7c45059-acf8-4cb3-b1f6-f07128d72141-kube-api-access-p9lfs\") pod \"downloads-7954f5f757-4z7zk\" (UID: \"a7c45059-acf8-4cb3-b1f6-f07128d72141\") " pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.410608 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.412412 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/50255da2-a710-48bc-8a00-36146dec247a-bound-sa-token\") pod \"ingress-operator-5b745b69d9-bs76n\" (UID: \"50255da2-a710-48bc-8a00-36146dec247a\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.413573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66c4t\" (UniqueName: \"kubernetes.io/projected/5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5-kube-api-access-66c4t\") pod \"dns-default-655n6\" (UID: \"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5\") " pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.417953 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.418504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/2b934596-5580-41ba-8ad2-8722f4cf476d-kube-api-access-f8nv5\") pod \"redhat-marketplace-6tzw9\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.418693 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj5hx\" (UniqueName: \"kubernetes.io/projected/8766c23e-233b-4eab-9d32-793e70fa9284-kube-api-access-hj5hx\") pod \"ingress-canary-sg6kz\" (UID: \"8766c23e-233b-4eab-9d32-793e70fa9284\") " 
pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.418922 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.422217 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.422511 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ksns\" (UniqueName: \"kubernetes.io/projected/dd66ee17-c8c5-42a4-b1ea-0cb841713ec1-kube-api-access-7ksns\") pod \"openshift-controller-manager-operator-756b6f6bc6-6tvbc\" (UID: \"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.425498 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbrp\" (UniqueName: \"kubernetes.io/projected/dc904419-43b3-4164-8efb-b493171791cc-kube-api-access-dsbrp\") pod \"community-operators-7vlvg\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.430076 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477433 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477619 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/21d99397-dae5-442c-b0e7-bfb634866216-certs\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477668 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-catalog-content\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477691 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-utilities\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477713 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rnbt\" (UniqueName: \"kubernetes.io/projected/82dff338-35e1-44df-8f20-a4d4d8b3c198-kube-api-access-2rnbt\") pod \"certified-operators-lvkz9\" (UID: 
\"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477744 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99ms7\" (UniqueName: \"kubernetes.io/projected/9a31a895-ced3-4285-8105-448501c3ceac-kube-api-access-99ms7\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477763 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-catalog-content\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477786 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6484\" (UniqueName: \"kubernetes.io/projected/8f8a5699-e034-4255-b82a-d58becd6a2aa-kube-api-access-q6484\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477807 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc884fdf-9890-4cc6-b0cf-9028a290209b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477848 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlcrc\" (UniqueName: \"kubernetes.io/projected/21d99397-dae5-442c-b0e7-bfb634866216-kube-api-access-xlcrc\") pod \"machine-config-server-ms2ld\" 
(UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477870 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-catalog-content\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477892 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-utilities\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477920 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-plugins-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477941 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-socket-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477964 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-mountpoint-dir\") pod \"csi-hostpathplugin-dctml\" (UID: 
\"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.477983 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-utilities\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-utilities\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqxbg\" (UniqueName: \"kubernetes.io/projected/d6868925-795c-4765-9343-0b147db98216-kube-api-access-xqxbg\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478060 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-utilities\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478082 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dcb2\" (UniqueName: \"kubernetes.io/projected/0a8433ae-09da-4dfb-98c6-922fcfbaa546-kube-api-access-4dcb2\") pod \"redhat-marketplace-tfjjf\" (UID: 
\"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478103 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gmld\" (UniqueName: \"kubernetes.io/projected/0840f674-6e13-4336-ad20-a67b979ae5ba-kube-api-access-5gmld\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478122 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzpbh\" (UniqueName: \"kubernetes.io/projected/7fef7638-98df-405a-b04b-f47997b46eac-kube-api-access-gzpbh\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478152 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8tt\" (UniqueName: \"kubernetes.io/projected/ec0983c2-4cd5-41aa-972c-60dd47817a5b-kube-api-access-7f8tt\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-csi-data-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478240 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-catalog-content\") pod \"certified-operators-kv2xc\" (UID: 
\"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.478237 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:47.978191091 +0000 UTC m=+130.668263062 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478307 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/21d99397-dae5-442c-b0e7-bfb634866216-node-bootstrap-token\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-registration-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-utilities\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-catalog-content\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478457 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478491 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-catalog-content\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f8a5699-e034-4255-b82a-d58becd6a2aa-signing-key\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-catalog-content\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478576 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f8a5699-e034-4255-b82a-d58becd6a2aa-signing-cabundle\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478606 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc884fdf-9890-4cc6-b0cf-9028a290209b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478611 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-catalog-content\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478823 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-utilities\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478852 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-catalog-content\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.478275 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-csi-data-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479080 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-catalog-content\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479230 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-utilities\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479351 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-catalog-content\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479414 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-registration-dir\") 
pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479491 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-plugins-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479608 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-utilities\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479659 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc884fdf-9890-4cc6-b0cf-9028a290209b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479724 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-socket-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479790 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-utilities\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " 
pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.479925 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9a31a895-ced3-4285-8105-448501c3ceac-mountpoint-dir\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.480028 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-utilities\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.480056 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-catalog-content\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.480342 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:47.980331158 +0000 UTC m=+130.670403129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.480635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f8a5699-e034-4255-b82a-d58becd6a2aa-signing-cabundle\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.480847 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-utilities\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.486314 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xk2qg"] Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.494299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/21d99397-dae5-442c-b0e7-bfb634866216-node-bootstrap-token\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.494749 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlcrc\" (UniqueName: 
\"kubernetes.io/projected/21d99397-dae5-442c-b0e7-bfb634866216-kube-api-access-xlcrc\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.501855 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f8a5699-e034-4255-b82a-d58becd6a2aa-signing-key\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.503306 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/21d99397-dae5-442c-b0e7-bfb634866216-certs\") pod \"machine-config-server-ms2ld\" (UID: \"21d99397-dae5-442c-b0e7-bfb634866216\") " pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.504459 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc884fdf-9890-4cc6-b0cf-9028a290209b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.505318 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99ms7\" (UniqueName: \"kubernetes.io/projected/9a31a895-ced3-4285-8105-448501c3ceac-kube-api-access-99ms7\") pod \"csi-hostpathplugin-dctml\" (UID: \"9a31a895-ced3-4285-8105-448501c3ceac\") " pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.508055 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dcb2\" (UniqueName: 
\"kubernetes.io/projected/0a8433ae-09da-4dfb-98c6-922fcfbaa546-kube-api-access-4dcb2\") pod \"redhat-marketplace-tfjjf\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.508224 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rnbt\" (UniqueName: \"kubernetes.io/projected/82dff338-35e1-44df-8f20-a4d4d8b3c198-kube-api-access-2rnbt\") pod \"certified-operators-lvkz9\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.508949 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gmld\" (UniqueName: \"kubernetes.io/projected/0840f674-6e13-4336-ad20-a67b979ae5ba-kube-api-access-5gmld\") pod \"redhat-operators-xxd4x\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.510647 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzpbh\" (UniqueName: \"kubernetes.io/projected/7fef7638-98df-405a-b04b-f47997b46eac-kube-api-access-gzpbh\") pod \"redhat-operators-m6c4k\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.507165 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8tt\" (UniqueName: \"kubernetes.io/projected/ec0983c2-4cd5-41aa-972c-60dd47817a5b-kube-api-access-7f8tt\") pod \"certified-operators-kv2xc\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.512046 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqxbg\" (UniqueName: 
\"kubernetes.io/projected/d6868925-795c-4765-9343-0b147db98216-kube-api-access-xqxbg\") pod \"community-operators-ll7bv\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.513929 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sg6kz" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.514297 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6484\" (UniqueName: \"kubernetes.io/projected/8f8a5699-e034-4255-b82a-d58becd6a2aa-kube-api-access-q6484\") pod \"service-ca-9c57cc56f-hl25r\" (UID: \"8f8a5699-e034-4255-b82a-d58becd6a2aa\") " pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.519443 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ms2ld" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.532161 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.549548 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.555128 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.574857 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.579728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.579988 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.079971083 +0000 UTC m=+130.770043064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.580092 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.580356 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.080347535 +0000 UTC m=+130.770419516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.587224 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.589659 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.596358 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.613713 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.621415 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-655n6" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.638929 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.642923 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.651280 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.655198 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.680976 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.681297 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.18128155 +0000 UTC m=+130.871353521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.698890 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.738020 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.777359 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.782860 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.783197 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.283184105 +0000 UTC m=+130.973256076 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.798755 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dctml" Mar 12 13:11:47 crc kubenswrapper[4921]: W0312 13:11:47.802482 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d99397_dae5_442c_b0e7_bfb634866216.slice/crio-5841d7953eea7c4eb45a6797d72a0c13982a7567268a163537bf6c2914fcf3b9 WatchSource:0}: Error finding container 5841d7953eea7c4eb45a6797d72a0c13982a7567268a163537bf6c2914fcf3b9: Status 404 returned error can't find the container with id 5841d7953eea7c4eb45a6797d72a0c13982a7567268a163537bf6c2914fcf3b9 Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.818902 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5ns6b" event={"ID":"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c","Type":"ContainerStarted","Data":"0bca4aa358da49fd9072c4b64bd69c108325de2394c2c6adfedc4e646405bbeb"} Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.820524 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ms2ld" event={"ID":"21d99397-dae5-442c-b0e7-bfb634866216","Type":"ContainerStarted","Data":"5841d7953eea7c4eb45a6797d72a0c13982a7567268a163537bf6c2914fcf3b9"} Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.827378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" event={"ID":"81ec102d-42ba-4d41-952d-d36fa110e626","Type":"ContainerStarted","Data":"10f2850688d3a00bee279341c32453288841f7df15daa586ef137ac7d27713ea"} Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.828769 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" event={"ID":"a55aa2b4-ed3e-414c-8fa3-cba24092f81a","Type":"ContainerStarted","Data":"10da746a9af117f221b3762258635cb98995b26e1a145f943d523d6e3fd6f421"} Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 
13:11:47.829929 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" event={"ID":"42780a2c-c305-4915-9be7-799cec82b8b8","Type":"ContainerStarted","Data":"53377e784a4e6e04439b3a3fff8b714956b4404018bb2c92dff20d1dfde79775"} Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.837381 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rxv7d" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.838603 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fcwhd" Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.866320 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl"] Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.889284 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.891531 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.391500741 +0000 UTC m=+131.081572712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:47 crc kubenswrapper[4921]: I0312 13:11:47.991320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:47 crc kubenswrapper[4921]: E0312 13:11:47.991693 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.491676912 +0000 UTC m=+131.181748883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.092664 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.093338 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.59332181 +0000 UTC m=+131.283393781 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.196010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.197085 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.697061283 +0000 UTC m=+131.387133254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.298471 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.298778 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.798756091 +0000 UTC m=+131.488828062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.302436 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.303273 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.803259192 +0000 UTC m=+131.493331163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.404735 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.405648 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:48.905632071 +0000 UTC m=+131.595704042 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.506449 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.506745 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.006734362 +0000 UTC m=+131.696806333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.608068 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.608398 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.10838148 +0000 UTC m=+131.798453451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.710008 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.710325 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.210313226 +0000 UTC m=+131.900385197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.815302 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.815459 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.315434422 +0000 UTC m=+132.005506393 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.815620 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.815964 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.315951368 +0000 UTC m=+132.006023339 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.835184 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ms2ld" event={"ID":"21d99397-dae5-442c-b0e7-bfb634866216","Type":"ContainerStarted","Data":"18a5e238ad2c191147053866e1c795148059de6809c8ec18ac8b05cc2adbbc7e"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.836803 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" event={"ID":"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4","Type":"ContainerStarted","Data":"eb25cd504634ad0f0d828a6ceb8904f2e55d5fe03a7d2d77afadc6b82076fa4f"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.836860 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" event={"ID":"5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4","Type":"ContainerStarted","Data":"50037e2a86ef7361cf9ae44166b3748056d420132df1013077d4d6098156a6b7"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.838487 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" event={"ID":"81ec102d-42ba-4d41-952d-d36fa110e626","Type":"ContainerStarted","Data":"f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.838608 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.840215 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" event={"ID":"a55aa2b4-ed3e-414c-8fa3-cba24092f81a","Type":"ContainerStarted","Data":"dab00b074f40a058f44c4ad5fcf069fead23cf3a8ec673faee02055ca89ed3d8"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.840257 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" event={"ID":"a55aa2b4-ed3e-414c-8fa3-cba24092f81a","Type":"ContainerStarted","Data":"f76317a50558d5b3dccb7ea3a38082decbfc9086b421041b985fecd508b3d9f9"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.841307 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" event={"ID":"42780a2c-c305-4915-9be7-799cec82b8b8","Type":"ContainerStarted","Data":"0f4fccb06da86afdcba51cee7b99df9868f6887910603fa3254268b953b01927"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.843229 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5ns6b" event={"ID":"0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c","Type":"ContainerStarted","Data":"4f0d1e668fea40e095d0592a668976231621ea70fa45f8fa42e40797226a928a"} Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.851900 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ms2ld" podStartSLOduration=11.851885854 podStartE2EDuration="11.851885854s" podCreationTimestamp="2026-03-12 13:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:48.846417684 +0000 UTC m=+131.536489655" watchObservedRunningTime="2026-03-12 13:11:48.851885854 +0000 UTC m=+131.541957825" Mar 12 
13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.871253 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" podStartSLOduration=11.871228665 podStartE2EDuration="11.871228665s" podCreationTimestamp="2026-03-12 13:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:48.863091102 +0000 UTC m=+131.553163343" watchObservedRunningTime="2026-03-12 13:11:48.871228665 +0000 UTC m=+131.561300636" Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.884746 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-22xz2"] Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.886694 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt"] Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.895559 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-x8rdl" podStartSLOduration=65.89553395 podStartE2EDuration="1m5.89553395s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:48.881466633 +0000 UTC m=+131.571538604" watchObservedRunningTime="2026-03-12 13:11:48.89553395 +0000 UTC m=+131.585605921" Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.911624 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt"] Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.916350 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:48 crc kubenswrapper[4921]: E0312 13:11:48.917898 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.417875834 +0000 UTC m=+132.107947825 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.921916 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j5pwt"] Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.922222 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xk2qg" podStartSLOduration=65.922207069 podStartE2EDuration="1m5.922207069s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:48.916224353 +0000 UTC m=+131.606296344" watchObservedRunningTime="2026-03-12 13:11:48.922207069 +0000 UTC m=+131.612279040" Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.928060 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gdw7"] 
Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.949394 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fp9vb"] Mar 12 13:11:48 crc kubenswrapper[4921]: W0312 13:11:48.956200 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e82943_c5ab_4f7e_91d2_f99937a1ad40.slice/crio-343b7a935599220bfcf3793f83b0b093b258ccf1005f892af4f4df2f13d5564b WatchSource:0}: Error finding container 343b7a935599220bfcf3793f83b0b093b258ccf1005f892af4f4df2f13d5564b: Status 404 returned error can't find the container with id 343b7a935599220bfcf3793f83b0b093b258ccf1005f892af4f4df2f13d5564b Mar 12 13:11:48 crc kubenswrapper[4921]: W0312 13:11:48.958038 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57677fcb_c7a5_431c_b751_ec13d22484b1.slice/crio-00a058fd86aefa56b6722c8efea6027b50404dae9316d21f1f3b9db3d3b66af0 WatchSource:0}: Error finding container 00a058fd86aefa56b6722c8efea6027b50404dae9316d21f1f3b9db3d3b66af0: Status 404 returned error can't find the container with id 00a058fd86aefa56b6722c8efea6027b50404dae9316d21f1f3b9db3d3b66af0 Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.965603 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6"] Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.966487 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9jx6b" podStartSLOduration=66.966459033 podStartE2EDuration="1m6.966459033s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:48.934482269 +0000 UTC m=+131.624554240" 
watchObservedRunningTime="2026-03-12 13:11:48.966459033 +0000 UTC m=+131.656531004" Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.972385 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gdgrq"] Mar 12 13:11:48 crc kubenswrapper[4921]: I0312 13:11:48.972841 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5ns6b" podStartSLOduration=65.97280455 podStartE2EDuration="1m5.97280455s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:48.967499535 +0000 UTC m=+131.657571506" watchObservedRunningTime="2026-03-12 13:11:48.97280455 +0000 UTC m=+131.662876521" Mar 12 13:11:48 crc kubenswrapper[4921]: W0312 13:11:48.975487 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6757f226_348a_4d6c_a9ee_22c6315701af.slice/crio-9b9eec0af14c0670bc87455d1fc6807c2b705b7e61d152a0e5d38be71aef8f25 WatchSource:0}: Error finding container 9b9eec0af14c0670bc87455d1fc6807c2b705b7e61d152a0e5d38be71aef8f25: Status 404 returned error can't find the container with id 9b9eec0af14c0670bc87455d1fc6807c2b705b7e61d152a0e5d38be71aef8f25 Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.008437 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.022895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.024487 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.524470695 +0000 UTC m=+132.214542666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.125170 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.125895 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.625874315 +0000 UTC m=+132.315946286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.145417 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.151438 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:49 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:49 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:49 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.151527 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.212261 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sg6kz"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.218733 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4c92v"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.224988 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-7vlvg"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.226602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.226863 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lvkz9"] Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.227112 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.727089519 +0000 UTC m=+132.417161490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.244610 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tzw9"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.259830 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5jsfz"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.259879 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-4z7zk"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.264777 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pztgf"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.278585 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-r7sfx"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.280375 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gg92s"] Mar 12 13:11:49 crc kubenswrapper[4921]: W0312 13:11:49.280379 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc904419_43b3_4164_8efb_b493171791cc.slice/crio-737b7a0e9d1bf6c8f9f36aeeee1f7067b2978ff3ff389e9a6d17e7985d911a25 WatchSource:0}: Error finding container 737b7a0e9d1bf6c8f9f36aeeee1f7067b2978ff3ff389e9a6d17e7985d911a25: Status 404 returned error can't find the container with id 
737b7a0e9d1bf6c8f9f36aeeee1f7067b2978ff3ff389e9a6d17e7985d911a25 Mar 12 13:11:49 crc kubenswrapper[4921]: W0312 13:11:49.315593 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82dff338_35e1_44df_8f20_a4d4d8b3c198.slice/crio-6981606ed8b29046bde701e12e11093518e4ea9b9abe1c310554637754003ec3 WatchSource:0}: Error finding container 6981606ed8b29046bde701e12e11093518e4ea9b9abe1c310554637754003ec3: Status 404 returned error can't find the container with id 6981606ed8b29046bde701e12e11093518e4ea9b9abe1c310554637754003ec3 Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.315739 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.320022 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.323654 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ll7bv"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.325497 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-655n6"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.327372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.327680 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-12 13:11:49.827656833 +0000 UTC m=+132.517728804 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.327895 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.328372 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.828356885 +0000 UTC m=+132.518428856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.365447 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.413430 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m6c4k"] Mar 12 13:11:49 crc kubenswrapper[4921]: W0312 13:11:49.417618 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9823f1cf_662f_4896_a6a0_a3bfb3aa660b.slice/crio-9aa19d924e8d1ce76fb342f5342be95c679fba266015142d6815e56af97a23e0 WatchSource:0}: Error finding container 9aa19d924e8d1ce76fb342f5342be95c679fba266015142d6815e56af97a23e0: Status 404 returned error can't find the container with id 9aa19d924e8d1ce76fb342f5342be95c679fba266015142d6815e56af97a23e0 Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.430523 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f403288d-b503-4f0c-bf83-3b29ff86ab94-secret-volume\") pod \"f403288d-b503-4f0c-bf83-3b29ff86ab94\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.430953 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dfpw\" (UniqueName: \"kubernetes.io/projected/f403288d-b503-4f0c-bf83-3b29ff86ab94-kube-api-access-5dfpw\") pod \"f403288d-b503-4f0c-bf83-3b29ff86ab94\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.431182 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.431298 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f403288d-b503-4f0c-bf83-3b29ff86ab94-config-volume\") pod \"f403288d-b503-4f0c-bf83-3b29ff86ab94\" (UID: \"f403288d-b503-4f0c-bf83-3b29ff86ab94\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.431585 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.931547681 +0000 UTC m=+132.621619652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.431784 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.432170 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:49.932163559 +0000 UTC m=+132.622235530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.432504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f403288d-b503-4f0c-bf83-3b29ff86ab94-config-volume" (OuterVolumeSpecName: "config-volume") pod "f403288d-b503-4f0c-bf83-3b29ff86ab94" (UID: "f403288d-b503-4f0c-bf83-3b29ff86ab94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.443026 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f403288d-b503-4f0c-bf83-3b29ff86ab94-kube-api-access-5dfpw" (OuterVolumeSpecName: "kube-api-access-5dfpw") pod "f403288d-b503-4f0c-bf83-3b29ff86ab94" (UID: "f403288d-b503-4f0c-bf83-3b29ff86ab94"). InnerVolumeSpecName "kube-api-access-5dfpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.445131 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f403288d-b503-4f0c-bf83-3b29ff86ab94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f403288d-b503-4f0c-bf83-3b29ff86ab94" (UID: "f403288d-b503-4f0c-bf83-3b29ff86ab94"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:49 crc kubenswrapper[4921]: W0312 13:11:49.452653 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-7f23679c5837a2d02481be39b62a24898aa6fb887cc6cf6fbeb05ed44d0320b9 WatchSource:0}: Error finding container 7f23679c5837a2d02481be39b62a24898aa6fb887cc6cf6fbeb05ed44d0320b9: Status 404 returned error can't find the container with id 7f23679c5837a2d02481be39b62a24898aa6fb887cc6cf6fbeb05ed44d0320b9 Mar 12 13:11:49 crc kubenswrapper[4921]: W0312 13:11:49.454723 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345c9c4b_5322_4521_abdb_5736718e654c.slice/crio-0d8c85d3741c915c71a189d421d3037f757c8248f2984b69f283e1154f6099e6 WatchSource:0}: Error finding container 0d8c85d3741c915c71a189d421d3037f757c8248f2984b69f283e1154f6099e6: Status 404 returned error can't find the container with id 0d8c85d3741c915c71a189d421d3037f757c8248f2984b69f283e1154f6099e6 Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.480404 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.534038 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.534304 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-12 13:11:50.034277632 +0000 UTC m=+132.724349603 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.534432 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.534669 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f403288d-b503-4f0c-bf83-3b29ff86ab94-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.534732 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f403288d-b503-4f0c-bf83-3b29ff86ab94-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.534847 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dfpw\" (UniqueName: \"kubernetes.io/projected/f403288d-b503-4f0c-bf83-3b29ff86ab94-kube-api-access-5dfpw\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.534995 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.034980524 +0000 UTC m=+132.725052495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.636300 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.636693 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.136656332 +0000 UTC m=+132.826728303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.636936 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.637336 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.137328283 +0000 UTC m=+132.827400254 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.654012 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.711714 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xxd4x"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.739756 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.740208 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.240188958 +0000 UTC m=+132.930260929 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.744603 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hl25r"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.755777 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dctml"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.786038 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfjjf"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.802727 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.818613 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kv2xc"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.842156 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.842762 4921 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.342745654 +0000 UTC m=+133.032817625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.858038 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.895222 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" event={"ID":"bc5516ca-a316-4768-85b7-1acc90471ad3","Type":"ContainerStarted","Data":"406bf6974ef86e6b0d5167044ab2a13826619728d1e45a746e9a8192847f5507"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.895724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" event={"ID":"bc5516ca-a316-4768-85b7-1acc90471ad3","Type":"ContainerStarted","Data":"33851eab32f6fa78604c75e3906316a2089238fc15d93fd0d4fa9067c585afab"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.897297 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.898193 4921 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6pfpd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" start-of-body= Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.898257 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" podUID="bc5516ca-a316-4768-85b7-1acc90471ad3" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.31:5443/healthz\": dial tcp 10.217.0.31:5443: connect: connection refused" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.898743 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" event={"ID":"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce","Type":"ContainerStarted","Data":"44162cb4e9c6f62c5c9294b6b379e6c7c7bf059550665fcddd2041e755a22127"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.905987 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" event={"ID":"e0e82943-c5ab-4f7e-91d2-f99937a1ad40","Type":"ContainerStarted","Data":"7d162a6109501d840d707da019508fa2136209f3ffa3aa07c747ae03b5a18c4b"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.906030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" event={"ID":"e0e82943-c5ab-4f7e-91d2-f99937a1ad40","Type":"ContainerStarted","Data":"343b7a935599220bfcf3793f83b0b093b258ccf1005f892af4f4df2f13d5564b"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.912345 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" podStartSLOduration=66.912325946 podStartE2EDuration="1m6.912325946s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-12 13:11:49.910836189 +0000 UTC m=+132.600908160" watchObservedRunningTime="2026-03-12 13:11:49.912325946 +0000 UTC m=+132.602397917" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.915457 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerStarted","Data":"6981606ed8b29046bde701e12e11093518e4ea9b9abe1c310554637754003ec3"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.920039 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"756de89da87a0935e28492ff8f6bdcc0a9ddc58f26aec68cc67882a77fe81dfa"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.920837 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.927044 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" event={"ID":"bd5e10c8-1017-4083-a5d8-550f2aca7920","Type":"ContainerStarted","Data":"682e4b1ea6931bc5bd199efa644778ab4f1a990421186a114992f78d0d239578"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.928011 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9lvzt" podStartSLOduration=67.927985582 podStartE2EDuration="1m7.927985582s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:49.923398919 +0000 UTC m=+132.613470900" watchObservedRunningTime="2026-03-12 13:11:49.927985582 +0000 UTC m=+132.618057553" Mar 12 13:11:49 crc 
kubenswrapper[4921]: I0312 13:11:49.930375 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerStarted","Data":"2088bf489087b1ad5f240daf5b7a6b8bd0ad936491075b8b1a3102de7c356546"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.934828 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerStarted","Data":"e31317af4b6c9d980d16ea822c5a98a1a2f2559c32cca024a2f244625fd26be0"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.939722 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48430: no serving certificate available for the kubelet" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.943095 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:49 crc kubenswrapper[4921]: E0312 13:11:49.945123 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.445106674 +0000 UTC m=+133.135178645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:49 crc kubenswrapper[4921]: W0312 13:11:49.953650 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podff9848fa_2816_4e2c_96a8_b7bc9a13ceed.slice/crio-60d4bac1444c4758688f786b262b1c259b3262c85f5dcb32b8db910b0423c883 WatchSource:0}: Error finding container 60d4bac1444c4758688f786b262b1c259b3262c85f5dcb32b8db910b0423c883: Status 404 returned error can't find the container with id 60d4bac1444c4758688f786b262b1c259b3262c85f5dcb32b8db910b0423c883 Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.954101 4921 generic.go:334] "Generic (PLEG): container finished" podID="675e0fd3-342d-46b4-968a-33dd611eb8c0" containerID="28160d6c0aa8d75013a253a1cb719b272124d624c7164d20a322be880fce9211" exitCode=0 Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.954157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" event={"ID":"675e0fd3-342d-46b4-968a-33dd611eb8c0","Type":"ContainerDied","Data":"28160d6c0aa8d75013a253a1cb719b272124d624c7164d20a322be880fce9211"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.954186 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" event={"ID":"675e0fd3-342d-46b4-968a-33dd611eb8c0","Type":"ContainerStarted","Data":"ca14e64a66b5eaa9762116ead8f0861cdfee31fef42a0c92c6731a21e9dd2685"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.956712 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-655n6" event={"ID":"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5","Type":"ContainerStarted","Data":"034b57e95d5fd4b7de6b3154c634cc29116f2063c7e3b2736b1f85ce2504f6a4"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.958297 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" event={"ID":"63c96b06-6182-4472-b8a8-393c627c77c9","Type":"ContainerStarted","Data":"5833de90ffecb54e3ae2422a8be22879fce1ae45408920b800093d082a322e6b"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.958961 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" event={"ID":"98a0cc52-4219-45b7-a15f-d763979accbc","Type":"ContainerStarted","Data":"30ad0ee2ae6413762789abc15914b7ee1f5c5782516f4be922b4dda0c2a77fe3"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.961282 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" event={"ID":"6757f226-348a-4d6c-a9ee-22c6315701af","Type":"ContainerStarted","Data":"94a6984f0d73fb7ff440d7c738b48cf9ef5ed87bb3ac7b63fcb11cbcb355c1fb"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.961304 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" event={"ID":"6757f226-348a-4d6c-a9ee-22c6315701af","Type":"ContainerStarted","Data":"9b9eec0af14c0670bc87455d1fc6807c2b705b7e61d152a0e5d38be71aef8f25"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.972945 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" event={"ID":"f403288d-b503-4f0c-bf83-3b29ff86ab94","Type":"ContainerDied","Data":"7a3e9667630e8d4701542a5a65087e6697285fb7735bf6b95e5484279fac4394"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.972986 4921 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="7a3e9667630e8d4701542a5a65087e6697285fb7735bf6b95e5484279fac4394" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.973059 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.979636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" event={"ID":"ec32db1f-c08c-4ea3-93c0-13dee21a1deb","Type":"ContainerStarted","Data":"6e1c70536d362095bb93178839090e76965f03dcbd57a6b2cc5b0459ed3853d9"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.979680 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" event={"ID":"ec32db1f-c08c-4ea3-93c0-13dee21a1deb","Type":"ContainerStarted","Data":"dd792cd41d34c4336da2cff055144ed9b49aa87c9a9e3cee24576475a07ec653"} Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.980096 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.984597 4921 patch_prober.go:28] interesting pod/console-operator-58897d9998-gdgrq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Mar 12 13:11:49 crc kubenswrapper[4921]: I0312 13:11:49.984649 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" podUID="ec32db1f-c08c-4ea3-93c0-13dee21a1deb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.012490 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" podStartSLOduration=67.012469476 podStartE2EDuration="1m7.012469476s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.002879329 +0000 UTC m=+132.692951300" watchObservedRunningTime="2026-03-12 13:11:50.012469476 +0000 UTC m=+132.702541447" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.040759 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerStarted","Data":"1410175a8aafdda4091710a667d32ec77eb01332bfd1b617627fb5e5920ade5e"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.040797 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.041113 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48446: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.051239 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.051293 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jsfz" event={"ID":"9823f1cf-662f-4896-a6a0-a3bfb3aa660b","Type":"ContainerStarted","Data":"9aa19d924e8d1ce76fb342f5342be95c679fba266015142d6815e56af97a23e0"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.051320 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" 
event={"ID":"680f8033-da87-4897-bf8c-23b2ad8af659","Type":"ContainerStarted","Data":"422deaa84203af77ccb5781bfdd59512046bd17f282a52a9d3f6b65053857949"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.051336 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" event={"ID":"680f8033-da87-4897-bf8c-23b2ad8af659","Type":"ContainerStarted","Data":"0fdae05221a58a82c0c0e416b5d7fd3d85e42031c179195c3bea64c40366cd33"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.051345 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" event={"ID":"345c99f7-75d2-48da-9a45-6fd8ce5c92da","Type":"ContainerStarted","Data":"22e6c95684eb311d7129225949c0dcf1be5237c16fae604d8aebc3eef10fe70a"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.053973 4921 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7gdw7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.054029 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" podUID="680f8033-da87-4897-bf8c-23b2ad8af659" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.054405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.054891 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" podStartSLOduration=67.054879784 podStartE2EDuration="1m7.054879784s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.054376118 +0000 UTC m=+132.744448089" watchObservedRunningTime="2026-03-12 13:11:50.054879784 +0000 UTC m=+132.744951755" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.058767 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.558752654 +0000 UTC m=+133.248824625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.059595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4z7zk" event={"ID":"a7c45059-acf8-4cb3-b1f6-f07128d72141","Type":"ContainerStarted","Data":"0ae4c2914a9c6a591490973e62702b106d71c1eff1e4f17d40f439d8983617ce"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.060239 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.068796 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-4z7zk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.068864 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4z7zk" podUID="a7c45059-acf8-4cb3-b1f6-f07128d72141" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.072490 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" event={"ID":"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21","Type":"ContainerStarted","Data":"57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7"} 
Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.072554 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" event={"ID":"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21","Type":"ContainerStarted","Data":"4695ea916f6c28ee297a09f6c9d010b9cc32e8ce5d17780612467b079d88546c"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.074032 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.081557 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-4z7zk" podStartSLOduration=67.081538802 podStartE2EDuration="1m7.081538802s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.079895041 +0000 UTC m=+132.769967012" watchObservedRunningTime="2026-03-12 13:11:50.081538802 +0000 UTC m=+132.771610773" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.082041 4921 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-j5pwt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.082115 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" podUID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.085758 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-22xz2" event={"ID":"57677fcb-c7a5-431c-b751-ec13d22484b1","Type":"ContainerStarted","Data":"2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.085800 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-22xz2" event={"ID":"57677fcb-c7a5-431c-b751-ec13d22484b1","Type":"ContainerStarted","Data":"00a058fd86aefa56b6722c8efea6027b50404dae9316d21f1f3b9db3d3b66af0"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.100318 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.100298315 podStartE2EDuration="1.100298315s" podCreationTimestamp="2026-03-12 13:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.094381951 +0000 UTC m=+132.784453922" watchObservedRunningTime="2026-03-12 13:11:50.100298315 +0000 UTC m=+132.790370276" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.108104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sg6kz" event={"ID":"8766c23e-233b-4eab-9d32-793e70fa9284","Type":"ContainerStarted","Data":"f10fe3b481f89bc610f6836a0d932ed258a26586a30c77d7169d3a4df35f0a68"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.108155 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sg6kz" event={"ID":"8766c23e-233b-4eab-9d32-793e70fa9284","Type":"ContainerStarted","Data":"672f1c878fe413dace8e6686cf752e2896ba34099b79289d8b94ceef47b8ed6c"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.115343 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" 
event={"ID":"345c9c4b-5322-4521-abdb-5736718e654c","Type":"ContainerStarted","Data":"0d8c85d3741c915c71a189d421d3037f757c8248f2984b69f283e1154f6099e6"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.127189 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-22xz2" podStartSLOduration=67.127170339 podStartE2EDuration="1m7.127170339s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.127047125 +0000 UTC m=+132.817119096" watchObservedRunningTime="2026-03-12 13:11:50.127170339 +0000 UTC m=+132.817242310" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.129918 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" event={"ID":"50255da2-a710-48bc-8a00-36146dec247a","Type":"ContainerStarted","Data":"51049b1a7fddf07356608454f08aba65ee13229c747703186b913b5506079a45"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.133746 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerStarted","Data":"1030c6eede1e10e8324e130cc3a0269d20c3774c6fecbfbec94ac44e1a5bebf3"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.135954 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48454: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.136214 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dc884fdf-9890-4cc6-b0cf-9028a290209b","Type":"ContainerStarted","Data":"1b295437fc60623ad3203050f31d85f446fbe03b0cbee3955e9797353e386002"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.142246 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8e7567bd6660415886ad5407e4957ecb8cb58f285411e9b385d709b87b4ac986"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.152050 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:50 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:50 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:50 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.152115 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.154103 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" podStartSLOduration=67.154086386 podStartE2EDuration="1m7.154086386s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.14394419 +0000 UTC m=+132.834016161" watchObservedRunningTime="2026-03-12 13:11:50.154086386 +0000 UTC m=+132.844158357" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.156101 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.157028 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" event={"ID":"8f8a5699-e034-4255-b82a-d58becd6a2aa","Type":"ContainerStarted","Data":"ea9a32431221bd4e426cd426a604e83d8b1f7bb7fcfa8078fb3d7746c72303c3"} Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.157564 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.657543913 +0000 UTC m=+133.347615884 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.166464 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll7bv" event={"ID":"d6868925-795c-4765-9343-0b147db98216","Type":"ContainerStarted","Data":"b20f7dc4ec8a60f8e268eca74ad424470f1ab46f3317b3392240dbffbe2d776e"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.171954 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7f23679c5837a2d02481be39b62a24898aa6fb887cc6cf6fbeb05ed44d0320b9"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.173879 4921 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.176892 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerStarted","Data":"5cc786e0764817cbde2bb24b1ebed0ddc75fc26e757dba2a7eccc80eab22b728"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.176923 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerStarted","Data":"737b7a0e9d1bf6c8f9f36aeeee1f7067b2978ff3ff389e9a6d17e7985d911a25"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.183468 4921 generic.go:334] "Generic (PLEG): container finished" podID="e334bcf0-dbe3-41d4-974b-222a58148c43" containerID="6a193fab784a4d67d1e7d064f797a239cab780cff6a7634c847f159100ea797e" exitCode=0 Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.184163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" event={"ID":"e334bcf0-dbe3-41d4-974b-222a58148c43","Type":"ContainerDied","Data":"6a193fab784a4d67d1e7d064f797a239cab780cff6a7634c847f159100ea797e"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.184192 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" event={"ID":"e334bcf0-dbe3-41d4-974b-222a58148c43","Type":"ContainerStarted","Data":"68807427c3961271a478b733b9e4debd59461ce024e3297cd4e85103bbd86fab"} Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.206002 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sg6kz" podStartSLOduration=13.205983398 podStartE2EDuration="13.205983398s" podCreationTimestamp="2026-03-12 13:11:37 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:50.20542076 +0000 UTC m=+132.895492731" watchObservedRunningTime="2026-03-12 13:11:50.205983398 +0000 UTC m=+132.896055369" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.243878 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48470: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.258711 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.262222 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.762200054 +0000 UTC m=+133.452272025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.268743 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.338368 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48480: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.360543 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.360706 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.860682834 +0000 UTC m=+133.550754805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.361038 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.361420 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.861406666 +0000 UTC m=+133.551478637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.368409 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48488: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.443388 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48490: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.463774 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.464060 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.964039094 +0000 UTC m=+133.654111065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.464578 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.464886 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:50.964877601 +0000 UTC m=+133.654949572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.565332 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.565553 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.065514197 +0000 UTC m=+133.755586178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.566587 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.566964 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.066949841 +0000 UTC m=+133.757021812 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.667554 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.667805 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.167779443 +0000 UTC m=+133.857851414 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.668326 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.668716 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.168701052 +0000 UTC m=+133.858773023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.770321 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.771390 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.271365501 +0000 UTC m=+133.961437472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.799295 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48504: no serving certificate available for the kubelet" Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.879509 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.879852 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.37983748 +0000 UTC m=+134.069909441 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.980044 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.980149 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.480127446 +0000 UTC m=+134.170199417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:50 crc kubenswrapper[4921]: I0312 13:11:50.980538 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:50 crc kubenswrapper[4921]: E0312 13:11:50.980792 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.480782185 +0000 UTC m=+134.170854146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.082020 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.082322 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.582291968 +0000 UTC m=+134.272363939 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.082557 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.082912 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.582900498 +0000 UTC m=+134.272972469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.154388 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:51 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:51 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:51 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.154476 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.197707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.198307 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-12 13:11:51.698293092 +0000 UTC m=+134.388365063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.204430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-4z7zk" event={"ID":"a7c45059-acf8-4cb3-b1f6-f07128d72141","Type":"ContainerStarted","Data":"d55f807b32ba5aaaa444ddad2b654added39ad8193b49cbcc72125bd31ef3ea9"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.206010 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-4z7zk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.206178 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4z7zk" podUID="a7c45059-acf8-4cb3-b1f6-f07128d72141" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.212250 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lgwhc"] Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.219548 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"38628427d285a22bca278cf54f00405c7eff1ee8a47bac11290e4e1bf3d7b072"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.239910 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" event={"ID":"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1","Type":"ContainerStarted","Data":"63677e91be12a797b0abae13209f3e3315e83634e54b656d12d240c205a98406"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.239952 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" event={"ID":"dd66ee17-c8c5-42a4-b1ea-0cb841713ec1","Type":"ContainerStarted","Data":"75045b8be944b66664fefe15dc77a5fe056ce425d04b202e4d81d57ac576ef8f"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.249754 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" event={"ID":"345c9c4b-5322-4521-abdb-5736718e654c","Type":"ContainerStarted","Data":"2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.250130 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.252251 4921 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-zdw4r container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.252289 4921 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" podUID="345c9c4b-5322-4521-abdb-5736718e654c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.262940 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-6tvbc" podStartSLOduration=68.26292768 podStartE2EDuration="1m8.26292768s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.261418613 +0000 UTC m=+133.951490584" watchObservedRunningTime="2026-03-12 13:11:51.26292768 +0000 UTC m=+133.952999651" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.273197 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" event={"ID":"e334bcf0-dbe3-41d4-974b-222a58148c43","Type":"ContainerStarted","Data":"5f66559d9f83122006c2abad5fe8d6ac7d61a53a3b234839c7cbaace1f1fad78"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.274007 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.297996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-655n6" event={"ID":"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5","Type":"ContainerStarted","Data":"52989974d0e0b0418509c4d1bcef94cb893effd3f4223c5a1887cff0eda27465"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.298041 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-655n6" 
event={"ID":"5bb67fa1-cc41-45e2-bee8-a1f7313ae3b5","Type":"ContainerStarted","Data":"407413461f596bc6b57ff35a61523116874f6ab02803944238397ae65589ecc0"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.299049 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.299933 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.799922259 +0000 UTC m=+134.489994230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.302393 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-655n6" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.305365 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b83d5848871fd0a624607d24cba2c51600796aabe2b5f733be6ba1ca21ad2397"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.307162 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" podStartSLOduration=68.307147524 podStartE2EDuration="1m8.307147524s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.279388261 +0000 UTC m=+133.969460232" watchObservedRunningTime="2026-03-12 13:11:51.307147524 +0000 UTC m=+133.997219495" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.308948 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" podStartSLOduration=68.308942719 podStartE2EDuration="1m8.308942719s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.306826224 +0000 UTC m=+133.996898195" watchObservedRunningTime="2026-03-12 13:11:51.308942719 +0000 UTC m=+133.999014690" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.323346 4921 generic.go:334] "Generic (PLEG): container finished" podID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerID="90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.323452 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerDied","Data":"90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.329743 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" 
event={"ID":"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce","Type":"ContainerStarted","Data":"a911277ac8f2809155389ac0eafd14fe03e913d28177a70db104bfab58669e17"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.331655 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.353175 4921 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gg92s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused" start-of-body= Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.353463 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" podUID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.5:6443/healthz\": dial tcp 10.217.0.5:6443: connect: connection refused" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.353739 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" event={"ID":"6757f226-348a-4d6c-a9ee-22c6315701af","Type":"ContainerStarted","Data":"9e39f3acb33c8434eab891280cd5fbc81b93a1ee4edb6708c133e1d00b2427e8"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.367016 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0f5218191988e9267be623fdae0bcf91e2acc35e6760aa471b52c16c8633d8cd"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.386966 4921 generic.go:334] "Generic (PLEG): container finished" podID="7fef7638-98df-405a-b04b-f47997b46eac" containerID="8a428a004257dc86d124063cbca02a8f630a0448db405cd9804ec3c2ca410ac8" 
exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.387970 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerDied","Data":"8a428a004257dc86d124063cbca02a8f630a0448db405cd9804ec3c2ca410ac8"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.404495 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.405739 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:51.905723016 +0000 UTC m=+134.595794987 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.406132 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" event={"ID":"50255da2-a710-48bc-8a00-36146dec247a","Type":"ContainerStarted","Data":"b40ebdc2d126fdd27fba30ed8edf7ec7397c6b20495fd9679d68be6a35c9f435"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.406235 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" event={"ID":"50255da2-a710-48bc-8a00-36146dec247a","Type":"ContainerStarted","Data":"2305ee4f5d56e4ed376d76acaefbfa8bc79bcb02ac439b5150641d502d6e7321"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.415486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" event={"ID":"345c99f7-75d2-48da-9a45-6fd8ce5c92da","Type":"ContainerStarted","Data":"be90ba38bb3b826de6c3414560d9cf232f9ea7bd9e1647ed9a1cb3ebd2f0ec3f"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.415526 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" event={"ID":"345c99f7-75d2-48da-9a45-6fd8ce5c92da","Type":"ContainerStarted","Data":"b9088505d879094da904f77caa88dd8a75f112bec6d2181030b711687c357421"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.416011 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-655n6" 
podStartSLOduration=14.415999425 podStartE2EDuration="14.415999425s" podCreationTimestamp="2026-03-12 13:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.331478659 +0000 UTC m=+134.021550640" watchObservedRunningTime="2026-03-12 13:11:51.415999425 +0000 UTC m=+134.106071396" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.440106 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerID="26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.440194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerDied","Data":"26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.451986 4921 generic.go:334] "Generic (PLEG): container finished" podID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerID="e2fb925d9c390d588c644d7893e9e701b9fd982b2d5ab2f17b19f9d6ea8f2acc" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.452757 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerDied","Data":"e2fb925d9c390d588c644d7893e9e701b9fd982b2d5ab2f17b19f9d6ea8f2acc"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.472002 4921 generic.go:334] "Generic (PLEG): container finished" podID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerID="82b9d14f50937c345f0b422d4cfa5bb524775f1e32d61224f90b085dd36101eb" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.472076 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv2xc" 
event={"ID":"ec0983c2-4cd5-41aa-972c-60dd47817a5b","Type":"ContainerDied","Data":"82b9d14f50937c345f0b422d4cfa5bb524775f1e32d61224f90b085dd36101eb"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.472103 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv2xc" event={"ID":"ec0983c2-4cd5-41aa-972c-60dd47817a5b","Type":"ContainerStarted","Data":"0f6f61564077b52afe0655e33f51df2254868c50feeb4b55cd7844c02ce9a90a"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.477742 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed","Type":"ContainerStarted","Data":"a70f4053bb5eb87eaaee854cab3d5616af2e5eb8c697c419132c37ab9bfff786"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.477865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed","Type":"ContainerStarted","Data":"60d4bac1444c4758688f786b262b1c259b3262c85f5dcb32b8db910b0423c883"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.484442 4921 generic.go:334] "Generic (PLEG): container finished" podID="d6868925-795c-4765-9343-0b147db98216" containerID="188e540798880f0c525ad97b0007253b8558e5f555ef06379a81211555274157" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.484503 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll7bv" event={"ID":"d6868925-795c-4765-9343-0b147db98216","Type":"ContainerDied","Data":"188e540798880f0c525ad97b0007253b8558e5f555ef06379a81211555274157"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.494790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" 
event={"ID":"bd5e10c8-1017-4083-a5d8-550f2aca7920","Type":"ContainerStarted","Data":"355987f2d41ec05269d6545d24241a0496332ae1a95e8a73248e72c657697a7c"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.498038 4921 generic.go:334] "Generic (PLEG): container finished" podID="98a0cc52-4219-45b7-a15f-d763979accbc" containerID="2fdb60c392fbb8f8c9ea8b7ccad712152ff961f32dc62f2f535160880bf7c9f4" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.498098 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" event={"ID":"98a0cc52-4219-45b7-a15f-d763979accbc","Type":"ContainerDied","Data":"2fdb60c392fbb8f8c9ea8b7ccad712152ff961f32dc62f2f535160880bf7c9f4"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.503981 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" podStartSLOduration=69.503964897 podStartE2EDuration="1m9.503964897s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.496461944 +0000 UTC m=+134.186533915" watchObservedRunningTime="2026-03-12 13:11:51.503964897 +0000 UTC m=+134.194036868" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.504826 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" event={"ID":"675e0fd3-342d-46b4-968a-33dd611eb8c0","Type":"ContainerStarted","Data":"925264d5a1cbf1711a64c48e1e9ed9d0be3826de4e57f19940781ac536f03560"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.505826 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: 
\"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.508234 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.00822159 +0000 UTC m=+134.698293561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.508580 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dctml" event={"ID":"9a31a895-ced3-4285-8105-448501c3ceac","Type":"ContainerStarted","Data":"23936c8c76a5bb78bb1d39b4d828812f24128e6dd10c6c6cc059ab7f6717f220"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.521509 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" event={"ID":"8f8a5699-e034-4255-b82a-d58becd6a2aa","Type":"ContainerStarted","Data":"c2ce19cb9a59d93d932dcb9e447967aa82349107fa0aa1c75b92d28aae0c78f8"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.526361 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fp9vb" podStartSLOduration=68.526346282 podStartE2EDuration="1m8.526346282s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 13:11:51.525186827 +0000 UTC m=+134.215258788" watchObservedRunningTime="2026-03-12 13:11:51.526346282 +0000 UTC m=+134.216418253" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.532797 4921 generic.go:334] "Generic (PLEG): container finished" podID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerID="9ad326b7d0a378ea95e6e8aecd64182c5ec970f35774884b64f7bb38728636ab" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.532960 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerDied","Data":"9ad326b7d0a378ea95e6e8aecd64182c5ec970f35774884b64f7bb38728636ab"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.544786 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48510: no serving certificate available for the kubelet" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.553587 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" event={"ID":"63c96b06-6182-4472-b8a8-393c627c77c9","Type":"ContainerStarted","Data":"89d380a24f41c0e07bd38dec9b1cf75fd4c5e212572661fbed97c0317fa2d977"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.553632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" event={"ID":"63c96b06-6182-4472-b8a8-393c627c77c9","Type":"ContainerStarted","Data":"fe08edfa583c6559368cab328e773f024fee38ecc8d9042e0a891c111bbc667f"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.570658 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dc884fdf-9890-4cc6-b0cf-9028a290209b","Type":"ContainerStarted","Data":"7caf05766f7630931488c0c122aa3079a1e0fd9d6ceb68aa09e91b669f91b14a"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 
13:11:51.573005 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4c92v" podStartSLOduration=69.572991492 podStartE2EDuration="1m9.572991492s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.571164375 +0000 UTC m=+134.261236346" watchObservedRunningTime="2026-03-12 13:11:51.572991492 +0000 UTC m=+134.263063463" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.579769 4921 generic.go:334] "Generic (PLEG): container finished" podID="dc904419-43b3-4164-8efb-b493171791cc" containerID="5cc786e0764817cbde2bb24b1ebed0ddc75fc26e757dba2a7eccc80eab22b728" exitCode=0 Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.579978 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerDied","Data":"5cc786e0764817cbde2bb24b1ebed0ddc75fc26e757dba2a7eccc80eab22b728"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.583893 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jsfz" event={"ID":"9823f1cf-662f-4896-a6a0-a3bfb3aa660b","Type":"ContainerStarted","Data":"07eb8860db454cdd76df4d31736c942672df8d2aba9a7d89cbc4785889451ac3"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.584010 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jsfz" event={"ID":"9823f1cf-662f-4896-a6a0-a3bfb3aa660b","Type":"ContainerStarted","Data":"172bdd1235f03f8f5e0572984aa23f6e2d5a2dbd63a9e75ac3f2d811de747157"} Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.597977 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:11:51 
crc kubenswrapper[4921]: I0312 13:11:51.598265 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.605347 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gdgrq" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.606919 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.608550 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.108527956 +0000 UTC m=+134.798599927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.721411 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.727973 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.727953085 podStartE2EDuration="4.727953085s" podCreationTimestamp="2026-03-12 13:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.72777393 +0000 UTC m=+134.417845901" watchObservedRunningTime="2026-03-12 13:11:51.727953085 +0000 UTC m=+134.418025056" Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.734029 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.234010394 +0000 UTC m=+134.924082365 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.747753 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-r7sfx" podStartSLOduration=68.74773363 podStartE2EDuration="1m8.74773363s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.70651274 +0000 UTC m=+134.396584711" watchObservedRunningTime="2026-03-12 13:11:51.74773363 +0000 UTC m=+134.437805601" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.788864 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-bs76n" podStartSLOduration=68.788845888 podStartE2EDuration="1m8.788845888s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.764127019 +0000 UTC m=+134.454198990" watchObservedRunningTime="2026-03-12 13:11:51.788845888 +0000 UTC m=+134.478917859" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.791695 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5jsfz" podStartSLOduration=68.791667085 podStartE2EDuration="1m8.791667085s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.786729601 +0000 UTC m=+134.476801592" watchObservedRunningTime="2026-03-12 13:11:51.791667085 +0000 UTC m=+134.481739066" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.811789 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hl25r" podStartSLOduration=68.811772849 podStartE2EDuration="1m8.811772849s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.811265474 +0000 UTC m=+134.501337455" watchObservedRunningTime="2026-03-12 13:11:51.811772849 +0000 UTC m=+134.501844820" Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.827470 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.828022 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.328004834 +0000 UTC m=+135.018076805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.930480 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:51 crc kubenswrapper[4921]: E0312 13:11:51.930803 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.430791327 +0000 UTC m=+135.120863298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:51 crc kubenswrapper[4921]: I0312 13:11:51.944770 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" podStartSLOduration=68.944750531 podStartE2EDuration="1m8.944750531s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:51.919646381 +0000 UTC m=+134.609718352" watchObservedRunningTime="2026-03-12 13:11:51.944750531 +0000 UTC m=+134.634822502" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.031255 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.031423 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.531392972 +0000 UTC m=+135.221464953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.031485 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.032149 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.532129635 +0000 UTC m=+135.222201606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.045379 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tfcm2" podStartSLOduration=70.045363536 podStartE2EDuration="1m10.045363536s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:52.018638736 +0000 UTC m=+134.708710707" watchObservedRunningTime="2026-03-12 13:11:52.045363536 +0000 UTC m=+134.735435507" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.133254 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.134076 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.634055891 +0000 UTC m=+135.324127862 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.134756 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.135103 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.635090173 +0000 UTC m=+135.325162144 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.158488 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.158664 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:52 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:52 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:52 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.158695 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.158799 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.167043 4921 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-96jwt container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" 
start-of-body= Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.167091 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt" podUID="675e0fd3-342d-46b4-968a-33dd611eb8c0" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.9:8443/livez\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.177797 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6pfpd" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.236476 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.236617 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.736600587 +0000 UTC m=+135.426672548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.236674 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.237007 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.737000119 +0000 UTC m=+135.427072090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.339402 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.339596 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.839566575 +0000 UTC m=+135.529638546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.339650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.340235 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.840224916 +0000 UTC m=+135.530296887 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.441138 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.441234 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.941212783 +0000 UTC m=+135.631284764 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.441362 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.441618 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:52.941607906 +0000 UTC m=+135.631679877 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.542248 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.542410 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.042386166 +0000 UTC m=+135.732458137 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.543010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.543289 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.043277633 +0000 UTC m=+135.733349604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.600957 4921 generic.go:334] "Generic (PLEG): container finished" podID="ff9848fa-2816-4e2c-96a8-b7bc9a13ceed" containerID="a70f4053bb5eb87eaaee854cab3d5616af2e5eb8c697c419132c37ab9bfff786" exitCode=0 Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.601019 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed","Type":"ContainerDied","Data":"a70f4053bb5eb87eaaee854cab3d5616af2e5eb8c697c419132c37ab9bfff786"} Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.615863 4921 generic.go:334] "Generic (PLEG): container finished" podID="dc884fdf-9890-4cc6-b0cf-9028a290209b" containerID="7caf05766f7630931488c0c122aa3079a1e0fd9d6ceb68aa09e91b669f91b14a" exitCode=0 Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.615943 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dc884fdf-9890-4cc6-b0cf-9028a290209b","Type":"ContainerDied","Data":"7caf05766f7630931488c0c122aa3079a1e0fd9d6ceb68aa09e91b669f91b14a"} Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.644453 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.644759 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.144744815 +0000 UTC m=+135.834816786 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.654954 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" event={"ID":"98a0cc52-4219-45b7-a15f-d763979accbc","Type":"ContainerStarted","Data":"c4a7441ff56aee4fe579f608993d5f240103609c02b75c8f4f7c7c19758b8fbc"} Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.654997 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" event={"ID":"98a0cc52-4219-45b7-a15f-d763979accbc","Type":"ContainerStarted","Data":"1e5a5d2b7e7f7a2345b4206c378753d0ed3fee3876f8ade59e41090dfe01695e"} Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.663707 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dctml" event={"ID":"9a31a895-ced3-4285-8105-448501c3ceac","Type":"ContainerStarted","Data":"8a98472dc1136873ad0055554b5414ee4a35b7a325513eaea31a6300161886ae"} Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.673230 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-4z7zk 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.673278 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4z7zk" podUID="a7c45059-acf8-4cb3-b1f6-f07128d72141" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.681559 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" gracePeriod=30 Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.685866 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.695919 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" podStartSLOduration=70.695897155 podStartE2EDuration="1m10.695897155s" podCreationTimestamp="2026-03-12 13:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:52.693907002 +0000 UTC m=+135.383978973" watchObservedRunningTime="2026-03-12 13:11:52.695897155 +0000 UTC m=+135.385969126" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.746594 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.754434 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.254419442 +0000 UTC m=+135.944491413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.849221 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.850673 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.350646408 +0000 UTC m=+136.040718379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.850861 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.852756 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.352744165 +0000 UTC m=+136.042816136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.919602 4921 ???:1] "http: TLS handshake error from 192.168.126.11:48514: no serving certificate available for the kubelet" Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.962589 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.962687 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.462670732 +0000 UTC m=+136.152742703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:52 crc kubenswrapper[4921]: I0312 13:11:52.962923 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:52 crc kubenswrapper[4921]: E0312 13:11:52.963228 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.463220087 +0000 UTC m=+136.153292058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.065234 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.065592 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.5655734 +0000 UTC m=+136.255645371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.126206 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.158096 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:53 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:53 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:53 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.158154 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.166733 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.167091 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.66708046 +0000 UTC m=+136.357152431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.185321 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.269691 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc884fdf-9890-4cc6-b0cf-9028a290209b-kube-api-access\") pod \"dc884fdf-9890-4cc6-b0cf-9028a290209b\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.269874 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.269897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc884fdf-9890-4cc6-b0cf-9028a290209b-kubelet-dir\") pod \"dc884fdf-9890-4cc6-b0cf-9028a290209b\" (UID: \"dc884fdf-9890-4cc6-b0cf-9028a290209b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.270159 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc884fdf-9890-4cc6-b0cf-9028a290209b-kubelet-dir" (OuterVolumeSpecName: 
"kubelet-dir") pod "dc884fdf-9890-4cc6-b0cf-9028a290209b" (UID: "dc884fdf-9890-4cc6-b0cf-9028a290209b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.270229 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.770215765 +0000 UTC m=+136.460287736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.301052 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc884fdf-9890-4cc6-b0cf-9028a290209b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc884fdf-9890-4cc6-b0cf-9028a290209b" (UID: "dc884fdf-9890-4cc6-b0cf-9028a290209b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.381602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.381671 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc884fdf-9890-4cc6-b0cf-9028a290209b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.381684 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc884fdf-9890-4cc6-b0cf-9028a290209b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.381953 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.88194192 +0000 UTC m=+136.572013891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.482848 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.483179 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:53.983163882 +0000 UTC m=+136.673235853 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.584110 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.584489 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.084471847 +0000 UTC m=+136.774543808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.634994 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r6qq6" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.684898 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.685045 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.185020971 +0000 UTC m=+136.875092942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.685222 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.685512 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.185500534 +0000 UTC m=+136.875572495 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.717661 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.719248 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dc884fdf-9890-4cc6-b0cf-9028a290209b","Type":"ContainerDied","Data":"1b295437fc60623ad3203050f31d85f446fbe03b0cbee3955e9797353e386002"} Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.719281 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b295437fc60623ad3203050f31d85f446fbe03b0cbee3955e9797353e386002" Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.792164 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.792386 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.292361199 +0000 UTC m=+136.982433170 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.792546 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.792944 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.292937055 +0000 UTC m=+136.983009026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.827669 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j5pwt"] Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.884551 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"] Mar 12 13:11:53 crc kubenswrapper[4921]: I0312 13:11:53.899069 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:53 crc kubenswrapper[4921]: E0312 13:11:53.899952 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.399936603 +0000 UTC m=+137.090008574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.003773 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:54 crc kubenswrapper[4921]: E0312 13:11:54.004603 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.504591437 +0000 UTC m=+137.194663408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.082780 4921 ???:1] "http: TLS handshake error from 192.168.126.11:59622: no serving certificate available for the kubelet" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.118582 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:54 crc kubenswrapper[4921]: E0312 13:11:54.118954 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.618917532 +0000 UTC m=+137.308989493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.119728 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:54 crc kubenswrapper[4921]: E0312 13:11:54.120125 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-12 13:11:54.620112225 +0000 UTC m=+137.310184196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-f2bw7" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.127300 4921 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.137193 4921 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-12T13:11:54.127333329Z","Handler":null,"Name":""} Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.145369 4921 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.145400 4921 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.151803 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:54 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:54 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:54 crc kubenswrapper[4921]: healthz check 
failed Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.151884 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.177631 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.226353 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.239512 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.327835 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kube-api-access\") pod \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.327880 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kubelet-dir\") pod \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\" (UID: \"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed\") " Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.328036 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.328438 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff9848fa-2816-4e2c-96a8-b7bc9a13ceed" (UID: "ff9848fa-2816-4e2c-96a8-b7bc9a13ceed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.340352 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff9848fa-2816-4e2c-96a8-b7bc9a13ceed" (UID: "ff9848fa-2816-4e2c-96a8-b7bc9a13ceed"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.362224 4921 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.362261 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.429489 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.429523 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff9848fa-2816-4e2c-96a8-b7bc9a13ceed-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.499725 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-f2bw7\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.762903 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff9848fa-2816-4e2c-96a8-b7bc9a13ceed","Type":"ContainerDied","Data":"60d4bac1444c4758688f786b262b1c259b3262c85f5dcb32b8db910b0423c883"} Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.763220 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60d4bac1444c4758688f786b262b1c259b3262c85f5dcb32b8db910b0423c883" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.763002 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.767572 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" podUID="345c9c4b-5322-4521-abdb-5736718e654c" containerName="route-controller-manager" containerID="cri-o://2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421" gracePeriod=30 Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.767860 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dctml" event={"ID":"9a31a895-ced3-4285-8105-448501c3ceac","Type":"ContainerStarted","Data":"f421172b445672e895963cdf80c929330cc88ef2db4de400b0c8005746318eb7"} Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.767891 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dctml" event={"ID":"9a31a895-ced3-4285-8105-448501c3ceac","Type":"ContainerStarted","Data":"9b57df3ca96246e1049b0c4daf08c06ab7171184370880e1434884fba398a836"} Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.768595 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" podUID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" containerName="controller-manager" 
containerID="cri-o://57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7" gracePeriod=30 Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.807317 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 13:11:54 crc kubenswrapper[4921]: I0312 13:11:54.814799 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.150358 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:55 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:55 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:55 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.150627 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.306573 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.445941 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmqtf\" (UniqueName: \"kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf\") pod \"345c9c4b-5322-4521-abdb-5736718e654c\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.446053 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-client-ca\") pod \"345c9c4b-5322-4521-abdb-5736718e654c\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.446151 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-config\") pod \"345c9c4b-5322-4521-abdb-5736718e654c\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.446190 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345c9c4b-5322-4521-abdb-5736718e654c-serving-cert\") pod \"345c9c4b-5322-4521-abdb-5736718e654c\" (UID: \"345c9c4b-5322-4521-abdb-5736718e654c\") " Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.447148 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-client-ca" (OuterVolumeSpecName: "client-ca") pod "345c9c4b-5322-4521-abdb-5736718e654c" (UID: "345c9c4b-5322-4521-abdb-5736718e654c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.447212 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-config" (OuterVolumeSpecName: "config") pod "345c9c4b-5322-4521-abdb-5736718e654c" (UID: "345c9c4b-5322-4521-abdb-5736718e654c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.452405 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf" (OuterVolumeSpecName: "kube-api-access-qmqtf") pod "345c9c4b-5322-4521-abdb-5736718e654c" (UID: "345c9c4b-5322-4521-abdb-5736718e654c"). InnerVolumeSpecName "kube-api-access-qmqtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.452587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c9c4b-5322-4521-abdb-5736718e654c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "345c9c4b-5322-4521-abdb-5736718e654c" (UID: "345c9c4b-5322-4521-abdb-5736718e654c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.473087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f2bw7"]
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.514402 4921 ???:1] "http: TLS handshake error from 192.168.126.11:59632: no serving certificate available for the kubelet"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.547277 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmqtf\" (UniqueName: \"kubernetes.io/projected/345c9c4b-5322-4521-abdb-5736718e654c-kube-api-access-qmqtf\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.547305 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.547316 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/345c9c4b-5322-4521-abdb-5736718e654c-config\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.547324 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/345c9c4b-5322-4521-abdb-5736718e654c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: W0312 13:11:55.550029 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a3ac39_3f54_47f8_947e_c5d5f4709c23.slice/crio-ef4379a2a58f57caa146d2690162fcab9e8047b97f9b297a5f0933354c774d1b WatchSource:0}: Error finding container ef4379a2a58f57caa146d2690162fcab9e8047b97f9b297a5f0933354c774d1b: Status 404 returned error can't find the container with id ef4379a2a58f57caa146d2690162fcab9e8047b97f9b297a5f0933354c774d1b
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.557662 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.648553 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-serving-cert\") pod \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") "
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.648591 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-proxy-ca-bundles\") pod \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") "
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.648608 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-config\") pod \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") "
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.648688 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbqb2\" (UniqueName: \"kubernetes.io/projected/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-kube-api-access-zbqb2\") pod \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") "
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.648728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-client-ca\") pod \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\" (UID: \"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21\") "
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.649725 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" (UID: "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.649901 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-client-ca" (OuterVolumeSpecName: "client-ca") pod "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" (UID: "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.652360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-config" (OuterVolumeSpecName: "config") pod "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" (UID: "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.655765 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" (UID: "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.658427 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-kube-api-access-zbqb2" (OuterVolumeSpecName: "kube-api-access-zbqb2") pod "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" (UID: "5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21"). InnerVolumeSpecName "kube-api-access-zbqb2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.749886 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.749921 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.749939 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-config\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.749951 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbqb2\" (UniqueName: \"kubernetes.io/projected/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-kube-api-access-zbqb2\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.749962 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21-client-ca\") on node \"crc\" DevicePath \"\""
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.778213 4921 generic.go:334] "Generic (PLEG): container finished" podID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" containerID="57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7" exitCode=0
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.778268 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.778329 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" event={"ID":"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21","Type":"ContainerDied","Data":"57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.778393 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-j5pwt" event={"ID":"5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21","Type":"ContainerDied","Data":"4695ea916f6c28ee297a09f6c9d010b9cc32e8ce5d17780612467b079d88546c"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.778432 4921 scope.go:117] "RemoveContainer" containerID="57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.781253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" event={"ID":"29a3ac39-3f54-47f8-947e-c5d5f4709c23","Type":"ContainerStarted","Data":"1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.781287 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" event={"ID":"29a3ac39-3f54-47f8-947e-c5d5f4709c23","Type":"ContainerStarted","Data":"ef4379a2a58f57caa146d2690162fcab9e8047b97f9b297a5f0933354c774d1b"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.781322 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.784545 4921 generic.go:334] "Generic (PLEG): container finished" podID="345c9c4b-5322-4521-abdb-5736718e654c" containerID="2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421" exitCode=0
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.784574 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.784609 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" event={"ID":"345c9c4b-5322-4521-abdb-5736718e654c","Type":"ContainerDied","Data":"2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.784633 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r" event={"ID":"345c9c4b-5322-4521-abdb-5736718e654c","Type":"ContainerDied","Data":"0d8c85d3741c915c71a189d421d3037f757c8248f2984b69f283e1154f6099e6"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.790290 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dctml" event={"ID":"9a31a895-ced3-4285-8105-448501c3ceac","Type":"ContainerStarted","Data":"b65d5aaaef06b2e2beb75d89e87907f9e46dcdd5b009daed6db673583c897f2f"}
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.809371 4921 scope.go:117] "RemoveContainer" containerID="57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7"
Mar 12 13:11:55 crc kubenswrapper[4921]: E0312 13:11:55.820207 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7\": container with ID starting with 57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7 not found: ID does not exist" containerID="57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.821322 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7"} err="failed to get container status \"57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7\": rpc error: code = NotFound desc = could not find container \"57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7\": container with ID starting with 57464ac9462fdb3f64170d72c165d6e904c91474503786b3e10ce2aec76543b7 not found: ID does not exist"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.821363 4921 scope.go:117] "RemoveContainer" containerID="2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.822272 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" podStartSLOduration=72.82225276 podStartE2EDuration="1m12.82225276s" podCreationTimestamp="2026-03-12 13:10:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:55.800126415 +0000 UTC m=+138.490198386" watchObservedRunningTime="2026-03-12 13:11:55.82225276 +0000 UTC m=+138.512324741"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.833018 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j5pwt"]
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.840049 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-j5pwt"]
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.841035 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dctml" podStartSLOduration=18.841026004 podStartE2EDuration="18.841026004s" podCreationTimestamp="2026-03-12 13:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:55.831725335 +0000 UTC m=+138.521797306" watchObservedRunningTime="2026-03-12 13:11:55.841026004 +0000 UTC m=+138.531097975"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.851232 4921 scope.go:117] "RemoveContainer" containerID="2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421"
Mar 12 13:11:55 crc kubenswrapper[4921]: E0312 13:11:55.855460 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421\": container with ID starting with 2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421 not found: ID does not exist" containerID="2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.855536 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421"} err="failed to get container status \"2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421\": rpc error: code = NotFound desc = could not find container \"2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421\": container with ID starting with 2b07b501298358a146643fbd981403403515df6d0354560f1c1ed58c213fa421 not found: ID does not exist"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.855596 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"]
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.861880 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-zdw4r"]
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.990105 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="345c9c4b-5322-4521-abdb-5736718e654c" path="/var/lib/kubelet/pods/345c9c4b-5322-4521-abdb-5736718e654c/volumes"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.990933 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" path="/var/lib/kubelet/pods/5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21/volumes"
Mar 12 13:11:55 crc kubenswrapper[4921]: I0312 13:11:55.991738 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.150865 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 13:11:56 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld
Mar 12 13:11:56 crc kubenswrapper[4921]: [+]process-running ok
Mar 12 13:11:56 crc kubenswrapper[4921]: healthz check failed
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.150937 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.403947 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rl674"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.691864 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"]
Mar 12 13:11:56 crc kubenswrapper[4921]: E0312 13:11:56.692186 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9848fa-2816-4e2c-96a8-b7bc9a13ceed" containerName="pruner"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9848fa-2816-4e2c-96a8-b7bc9a13ceed" containerName="pruner"
Mar 12 13:11:56 crc kubenswrapper[4921]: E0312 13:11:56.692216 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc884fdf-9890-4cc6-b0cf-9028a290209b" containerName="pruner"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692224 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc884fdf-9890-4cc6-b0cf-9028a290209b" containerName="pruner"
Mar 12 13:11:56 crc kubenswrapper[4921]: E0312 13:11:56.692249 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f403288d-b503-4f0c-bf83-3b29ff86ab94" containerName="collect-profiles"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692255 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f403288d-b503-4f0c-bf83-3b29ff86ab94" containerName="collect-profiles"
Mar 12 13:11:56 crc kubenswrapper[4921]: E0312 13:11:56.692269 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345c9c4b-5322-4521-abdb-5736718e654c" containerName="route-controller-manager"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692275 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c9c4b-5322-4521-abdb-5736718e654c" containerName="route-controller-manager"
Mar 12 13:11:56 crc kubenswrapper[4921]: E0312 13:11:56.692284 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" containerName="controller-manager"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692290 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" containerName="controller-manager"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692388 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9974c3-36df-4c1b-b6a2-c5bfb5e98f21" containerName="controller-manager"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692425 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9848fa-2816-4e2c-96a8-b7bc9a13ceed" containerName="pruner"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692435 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc884fdf-9890-4cc6-b0cf-9028a290209b" containerName="pruner"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692444 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="345c9c4b-5322-4521-abdb-5736718e654c" containerName="route-controller-manager"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.692452 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f403288d-b503-4f0c-bf83-3b29ff86ab94" containerName="collect-profiles"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.693005 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.696865 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.697044 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.697193 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.697329 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.698024 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.698385 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.707727 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"]
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.708952 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.712307 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.712969 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.713029 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.713272 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.713454 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.722061 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.737046 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"]
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.738033 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.742831 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"]
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.767985 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-config\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.768491 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-client-ca\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.768645 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf8tl\" (UniqueName: \"kubernetes.io/projected/870dbb4d-dc1c-4be9-840c-e1a7450587cc-kube-api-access-hf8tl\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.768824 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870dbb4d-dc1c-4be9-840c-e1a7450587cc-serving-cert\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.869922 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-config\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.869986 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-client-ca\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-client-ca\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klv66\" (UniqueName: \"kubernetes.io/projected/8feaebdc-ac88-4341-8989-d9617b8ef2a1-kube-api-access-klv66\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870078 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8feaebdc-ac88-4341-8989-d9617b8ef2a1-serving-cert\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870099 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf8tl\" (UniqueName: \"kubernetes.io/projected/870dbb4d-dc1c-4be9-840c-e1a7450587cc-kube-api-access-hf8tl\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870128 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-proxy-ca-bundles\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870153 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870dbb4d-dc1c-4be9-840c-e1a7450587cc-serving-cert\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.870192 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-config\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.871328 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-client-ca\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.871694 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-config\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.877947 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870dbb4d-dc1c-4be9-840c-e1a7450587cc-serving-cert\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.902527 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf8tl\" (UniqueName: \"kubernetes.io/projected/870dbb4d-dc1c-4be9-840c-e1a7450587cc-kube-api-access-hf8tl\") pod \"route-controller-manager-76876fdf5-62fl4\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.940052 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"]
Mar 12 13:11:56 crc kubenswrapper[4921]: E0312 13:11:56.940506 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-klv66 proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz" podUID="8feaebdc-ac88-4341-8989-d9617b8ef2a1"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.962411 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"]
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.962864 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.971325 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-proxy-ca-bundles\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.971429 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-config\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.971466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-client-ca\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.971485 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klv66\" (UniqueName: \"kubernetes.io/projected/8feaebdc-ac88-4341-8989-d9617b8ef2a1-kube-api-access-klv66\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.971674 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8feaebdc-ac88-4341-8989-d9617b8ef2a1-serving-cert\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.973423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-proxy-ca-bundles\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.997690 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-client-ca\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:56 crc kubenswrapper[4921]: I0312 13:11:56.998410 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-config\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.002668 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8feaebdc-ac88-4341-8989-d9617b8ef2a1-serving-cert\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.024246 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klv66\" (UniqueName: \"kubernetes.io/projected/8feaebdc-ac88-4341-8989-d9617b8ef2a1-kube-api-access-klv66\") pod \"controller-manager-5969dfd8c4-h5fhz\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.155076 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5ns6b"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.156803 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.156841 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-22xz2"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.161554 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 12 13:11:57 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld
Mar 12 13:11:57 crc kubenswrapper[4921]: [+]process-running ok
Mar 12 13:11:57 crc kubenswrapper[4921]: healthz check failed
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.161623 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.161643 4921 patch_prober.go:28] interesting pod/console-f9d7485db-22xz2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.161693 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-22xz2" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.181327 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.186909 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-96jwt"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.411973 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pztgf"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.412423 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pztgf"
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.419668 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-4z7zk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body=
Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.419717 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-4z7zk" podUID="a7c45059-acf8-4cb3-b1f6-f07128d72141" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": 
dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.420072 4921 patch_prober.go:28] interesting pod/downloads-7954f5f757-4z7zk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.420087 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-4z7zk" podUID="a7c45059-acf8-4cb3-b1f6-f07128d72141" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.24:8080/\": dial tcp 10.217.0.24:8080: connect: connection refused" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.420531 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.466519 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"] Mar 12 13:11:57 crc kubenswrapper[4921]: E0312 13:11:57.537766 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:11:57 crc kubenswrapper[4921]: E0312 13:11:57.539279 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:11:57 crc kubenswrapper[4921]: E0312 13:11:57.553304 4921 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:11:57 crc kubenswrapper[4921]: E0312 13:11:57.553385 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.828544 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" event={"ID":"870dbb4d-dc1c-4be9-840c-e1a7450587cc","Type":"ContainerStarted","Data":"3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c"} Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.829067 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" event={"ID":"870dbb4d-dc1c-4be9-840c-e1a7450587cc","Type":"ContainerStarted","Data":"be73937bc4d6321f45a26d65ab5cb72240beb70b3522ebca09a4beecf5be9fcf"} Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.829140 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.838660 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pztgf" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.844910 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.993575 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-proxy-ca-bundles\") pod \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.993644 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-config\") pod \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.993706 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klv66\" (UniqueName: \"kubernetes.io/projected/8feaebdc-ac88-4341-8989-d9617b8ef2a1-kube-api-access-klv66\") pod \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.993915 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8feaebdc-ac88-4341-8989-d9617b8ef2a1-serving-cert\") pod \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.993942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-client-ca\") pod \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\" (UID: \"8feaebdc-ac88-4341-8989-d9617b8ef2a1\") " Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.995369 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8feaebdc-ac88-4341-8989-d9617b8ef2a1" (UID: "8feaebdc-ac88-4341-8989-d9617b8ef2a1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.995666 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-client-ca" (OuterVolumeSpecName: "client-ca") pod "8feaebdc-ac88-4341-8989-d9617b8ef2a1" (UID: "8feaebdc-ac88-4341-8989-d9617b8ef2a1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:57 crc kubenswrapper[4921]: I0312 13:11:57.995876 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-config" (OuterVolumeSpecName: "config") pod "8feaebdc-ac88-4341-8989-d9617b8ef2a1" (UID: "8feaebdc-ac88-4341-8989-d9617b8ef2a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.009185 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8feaebdc-ac88-4341-8989-d9617b8ef2a1-kube-api-access-klv66" (OuterVolumeSpecName: "kube-api-access-klv66") pod "8feaebdc-ac88-4341-8989-d9617b8ef2a1" (UID: "8feaebdc-ac88-4341-8989-d9617b8ef2a1"). InnerVolumeSpecName "kube-api-access-klv66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.021139 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8feaebdc-ac88-4341-8989-d9617b8ef2a1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8feaebdc-ac88-4341-8989-d9617b8ef2a1" (UID: "8feaebdc-ac88-4341-8989-d9617b8ef2a1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.095717 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8feaebdc-ac88-4341-8989-d9617b8ef2a1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.095758 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.095768 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.095782 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8feaebdc-ac88-4341-8989-d9617b8ef2a1-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.095795 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klv66\" (UniqueName: \"kubernetes.io/projected/8feaebdc-ac88-4341-8989-d9617b8ef2a1-kube-api-access-klv66\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.152498 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:58 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:58 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:58 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.152580 4921 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.836660 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.836878 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" podUID="870dbb4d-dc1c-4be9-840c-e1a7450587cc" containerName="route-controller-manager" containerID="cri-o://3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c" gracePeriod=30 Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.857103 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" podStartSLOduration=3.857085912 podStartE2EDuration="3.857085912s" podCreationTimestamp="2026-03-12 13:11:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:11:58.85331715 +0000 UTC m=+141.543389121" watchObservedRunningTime="2026-03-12 13:11:58.857085912 +0000 UTC m=+141.547157883" Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.885459 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"] Mar 12 13:11:58 crc kubenswrapper[4921]: I0312 13:11:58.889512 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5969dfd8c4-h5fhz"] Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.148136 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:11:59 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:11:59 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:11:59 crc kubenswrapper[4921]: healthz check failed Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.148456 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.309780 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.434169 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-client-ca\") pod \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.434236 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf8tl\" (UniqueName: \"kubernetes.io/projected/870dbb4d-dc1c-4be9-840c-e1a7450587cc-kube-api-access-hf8tl\") pod \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.434288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870dbb4d-dc1c-4be9-840c-e1a7450587cc-serving-cert\") pod \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.434323 
4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-config\") pod \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\" (UID: \"870dbb4d-dc1c-4be9-840c-e1a7450587cc\") " Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.435343 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-config" (OuterVolumeSpecName: "config") pod "870dbb4d-dc1c-4be9-840c-e1a7450587cc" (UID: "870dbb4d-dc1c-4be9-840c-e1a7450587cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.435764 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "870dbb4d-dc1c-4be9-840c-e1a7450587cc" (UID: "870dbb4d-dc1c-4be9-840c-e1a7450587cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.444258 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870dbb4d-dc1c-4be9-840c-e1a7450587cc-kube-api-access-hf8tl" (OuterVolumeSpecName: "kube-api-access-hf8tl") pod "870dbb4d-dc1c-4be9-840c-e1a7450587cc" (UID: "870dbb4d-dc1c-4be9-840c-e1a7450587cc"). InnerVolumeSpecName "kube-api-access-hf8tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.452032 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870dbb4d-dc1c-4be9-840c-e1a7450587cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "870dbb4d-dc1c-4be9-840c-e1a7450587cc" (UID: "870dbb4d-dc1c-4be9-840c-e1a7450587cc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.536124 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/870dbb4d-dc1c-4be9-840c-e1a7450587cc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.536158 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.536166 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/870dbb4d-dc1c-4be9-840c-e1a7450587cc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.536176 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf8tl\" (UniqueName: \"kubernetes.io/projected/870dbb4d-dc1c-4be9-840c-e1a7450587cc-kube-api-access-hf8tl\") on node \"crc\" DevicePath \"\"" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.701246 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5c4c9867ff-skt86"] Mar 12 13:11:59 crc kubenswrapper[4921]: E0312 13:11:59.701445 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870dbb4d-dc1c-4be9-840c-e1a7450587cc" containerName="route-controller-manager" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.701456 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="870dbb4d-dc1c-4be9-840c-e1a7450587cc" containerName="route-controller-manager" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.701557 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="870dbb4d-dc1c-4be9-840c-e1a7450587cc" containerName="route-controller-manager" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.701936 4921 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.706276 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z"] Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.707106 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.708000 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.708220 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.708759 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.708946 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.709201 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.710959 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z"] Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.714132 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c4c9867ff-skt86"] Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.716847 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.718343 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.746115 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-config\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.746154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9dmj\" (UniqueName: \"kubernetes.io/projected/82232748-66cb-4f41-857a-b6bbd9e03cf4-kube-api-access-q9dmj\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.746175 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-proxy-ca-bundles\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.746201 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-client-ca\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " 
pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.746232 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82232748-66cb-4f41-857a-b6bbd9e03cf4-serving-cert\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.844410 4921 generic.go:334] "Generic (PLEG): container finished" podID="870dbb4d-dc1c-4be9-840c-e1a7450587cc" containerID="3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c" exitCode=0 Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.844474 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.844535 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" event={"ID":"870dbb4d-dc1c-4be9-840c-e1a7450587cc","Type":"ContainerDied","Data":"3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c"} Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.844570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4" event={"ID":"870dbb4d-dc1c-4be9-840c-e1a7450587cc","Type":"ContainerDied","Data":"be73937bc4d6321f45a26d65ab5cb72240beb70b3522ebca09a4beecf5be9fcf"} Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.844598 4921 scope.go:117] "RemoveContainer" containerID="3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847673 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-client-ca\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847725 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa61e61-04e1-4a45-966b-a347b6491128-serving-cert\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847774 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-config\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847801 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9dmj\" (UniqueName: \"kubernetes.io/projected/82232748-66cb-4f41-857a-b6bbd9e03cf4-kube-api-access-q9dmj\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847841 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-proxy-ca-bundles\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: 
\"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847872 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-client-ca\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847901 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-config\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847924 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksh5m\" (UniqueName: \"kubernetes.io/projected/1fa61e61-04e1-4a45-966b-a347b6491128-kube-api-access-ksh5m\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.847955 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82232748-66cb-4f41-857a-b6bbd9e03cf4-serving-cert\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.849665 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-client-ca\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.850536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-proxy-ca-bundles\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.850587 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-config\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.854665 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82232748-66cb-4f41-857a-b6bbd9e03cf4-serving-cert\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.869482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9dmj\" (UniqueName: \"kubernetes.io/projected/82232748-66cb-4f41-857a-b6bbd9e03cf4-kube-api-access-q9dmj\") pod \"controller-manager-5c4c9867ff-skt86\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.885929 4921 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"] Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.892483 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76876fdf5-62fl4"] Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.962399 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-config\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.962442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksh5m\" (UniqueName: \"kubernetes.io/projected/1fa61e61-04e1-4a45-966b-a347b6491128-kube-api-access-ksh5m\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.962500 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-client-ca\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.962553 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa61e61-04e1-4a45-966b-a347b6491128-serving-cert\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " 
pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.964144 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-client-ca\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.964331 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-config\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.979244 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa61e61-04e1-4a45-966b-a347b6491128-serving-cert\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:11:59 crc kubenswrapper[4921]: I0312 13:11:59.985397 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksh5m\" (UniqueName: \"kubernetes.io/projected/1fa61e61-04e1-4a45-966b-a347b6491128-kube-api-access-ksh5m\") pod \"route-controller-manager-6fdd7bd669-tm58z\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.030454 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870dbb4d-dc1c-4be9-840c-e1a7450587cc" 
path="/var/lib/kubelet/pods/870dbb4d-dc1c-4be9-840c-e1a7450587cc/volumes" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.030993 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8feaebdc-ac88-4341-8989-d9617b8ef2a1" path="/var/lib/kubelet/pods/8feaebdc-ac88-4341-8989-d9617b8ef2a1/volumes" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.035251 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.059760 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.142400 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555352-kdcsz"] Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.143265 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.145131 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.145310 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.146759 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555352-kdcsz"] Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.146844 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.148628 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:00 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:00 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:00 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.148685 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.266111 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnlg\" (UniqueName: \"kubernetes.io/projected/fc6eb617-cfea-4abf-81fd-8417dc305d9c-kube-api-access-nnnlg\") pod \"auto-csr-approver-29555352-kdcsz\" (UID: \"fc6eb617-cfea-4abf-81fd-8417dc305d9c\") " 
pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.367367 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnlg\" (UniqueName: \"kubernetes.io/projected/fc6eb617-cfea-4abf-81fd-8417dc305d9c-kube-api-access-nnnlg\") pod \"auto-csr-approver-29555352-kdcsz\" (UID: \"fc6eb617-cfea-4abf-81fd-8417dc305d9c\") " pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.399152 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnlg\" (UniqueName: \"kubernetes.io/projected/fc6eb617-cfea-4abf-81fd-8417dc305d9c-kube-api-access-nnnlg\") pod \"auto-csr-approver-29555352-kdcsz\" (UID: \"fc6eb617-cfea-4abf-81fd-8417dc305d9c\") " pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:00 crc kubenswrapper[4921]: I0312 13:12:00.487280 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:01 crc kubenswrapper[4921]: I0312 13:12:01.148642 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:01 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:01 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:01 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:01 crc kubenswrapper[4921]: I0312 13:12:01.148933 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:02 crc kubenswrapper[4921]: I0312 13:12:02.158021 4921 patch_prober.go:28] 
interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:02 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:02 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:02 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:02 crc kubenswrapper[4921]: I0312 13:12:02.158073 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:02 crc kubenswrapper[4921]: I0312 13:12:02.629635 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-655n6" Mar 12 13:12:03 crc kubenswrapper[4921]: I0312 13:12:03.149287 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:03 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:03 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:03 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:03 crc kubenswrapper[4921]: I0312 13:12:03.149344 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:04 crc kubenswrapper[4921]: I0312 13:12:04.150074 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:04 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:04 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:04 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:04 crc kubenswrapper[4921]: I0312 13:12:04.151963 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:05 crc kubenswrapper[4921]: I0312 13:12:05.147356 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:05 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:05 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:05 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:05 crc kubenswrapper[4921]: I0312 13:12:05.147422 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:05 crc kubenswrapper[4921]: I0312 13:12:05.775266 4921 ???:1] "http: TLS handshake error from 192.168.126.11:49688: no serving certificate available for the kubelet" Mar 12 13:12:06 crc kubenswrapper[4921]: I0312 13:12:06.179748 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:06 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 
13:12:06 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:06 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:06 crc kubenswrapper[4921]: I0312 13:12:06.179925 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:06 crc kubenswrapper[4921]: I0312 13:12:06.310987 4921 scope.go:117] "RemoveContainer" containerID="3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c" Mar 12 13:12:06 crc kubenswrapper[4921]: E0312 13:12:06.311700 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c\": container with ID starting with 3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c not found: ID does not exist" containerID="3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c" Mar 12 13:12:06 crc kubenswrapper[4921]: I0312 13:12:06.311743 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c"} err="failed to get container status \"3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c\": rpc error: code = NotFound desc = could not find container \"3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c\": container with ID starting with 3e80d7224b65dafe4605a244ac828926f7d8e04a8708f5943aa1385260a08b3c not found: ID does not exist" Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.148928 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:07 
crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:07 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:07 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.149008 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.157515 4921 patch_prober.go:28] interesting pod/console-f9d7485db-22xz2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.157593 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-22xz2" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.431783 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-4z7zk" Mar 12 13:12:07 crc kubenswrapper[4921]: E0312 13:12:07.536825 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:12:07 crc kubenswrapper[4921]: E0312 13:12:07.544035 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:12:07 crc kubenswrapper[4921]: E0312 13:12:07.546526 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:12:07 crc kubenswrapper[4921]: E0312 13:12:07.546583 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.831257 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5c4c9867ff-skt86"] Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.902145 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z"] Mar 12 13:12:07 crc kubenswrapper[4921]: I0312 13:12:07.918881 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" event={"ID":"82232748-66cb-4f41-857a-b6bbd9e03cf4","Type":"ContainerStarted","Data":"e2d6164e1a60abec9ea08cb5fea1bdd999fc0505ad904523c1ee150ff65bd3bc"} Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.012535 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.100275 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29555352-kdcsz"] Mar 12 13:12:08 crc kubenswrapper[4921]: W0312 13:12:08.108869 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6eb617_cfea_4abf_81fd_8417dc305d9c.slice/crio-3b744f06e733cebedc9b6d7695d011670917a223a570617d82c4417ff78b41e3 WatchSource:0}: Error finding container 3b744f06e733cebedc9b6d7695d011670917a223a570617d82c4417ff78b41e3: Status 404 returned error can't find the container with id 3b744f06e733cebedc9b6d7695d011670917a223a570617d82c4417ff78b41e3 Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.151682 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:08 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:08 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:08 crc kubenswrapper[4921]: healthz check failed Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.152172 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.927240 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" event={"ID":"fc6eb617-cfea-4abf-81fd-8417dc305d9c","Type":"ContainerStarted","Data":"3b744f06e733cebedc9b6d7695d011670917a223a570617d82c4417ff78b41e3"} Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.929446 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" 
event={"ID":"82232748-66cb-4f41-857a-b6bbd9e03cf4","Type":"ContainerStarted","Data":"292f63dbe64ce24c16a4731b5e3ca97db3bd234d46c97826674dcc990a3ec97f"} Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.929527 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.934515 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" event={"ID":"1fa61e61-04e1-4a45-966b-a347b6491128","Type":"ContainerStarted","Data":"c7842a09fa638ae438b87666d44051084a29b051aef3571e9a197f16eee8a5d7"} Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.934556 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" event={"ID":"1fa61e61-04e1-4a45-966b-a347b6491128","Type":"ContainerStarted","Data":"097b011c0795a0f1a2341d4a0805d3d81de5be38f8b966839b68ef323e9e5c13"} Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.934995 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.947591 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.947691 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.964423 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" podStartSLOduration=10.964401962 podStartE2EDuration="10.964401962s" 
podCreationTimestamp="2026-03-12 13:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:08.960142587 +0000 UTC m=+151.650214558" watchObservedRunningTime="2026-03-12 13:12:08.964401962 +0000 UTC m=+151.654473933" Mar 12 13:12:08 crc kubenswrapper[4921]: I0312 13:12:08.991552 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.991530182 podStartE2EDuration="1.991530182s" podCreationTimestamp="2026-03-12 13:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:08.984378369 +0000 UTC m=+151.674450340" watchObservedRunningTime="2026-03-12 13:12:08.991530182 +0000 UTC m=+151.681602143" Mar 12 13:12:09 crc kubenswrapper[4921]: I0312 13:12:09.016053 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" podStartSLOduration=11.01602818 podStartE2EDuration="11.01602818s" podCreationTimestamp="2026-03-12 13:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:09.004709946 +0000 UTC m=+151.694781917" watchObservedRunningTime="2026-03-12 13:12:09.01602818 +0000 UTC m=+151.706100151" Mar 12 13:12:09 crc kubenswrapper[4921]: I0312 13:12:09.151773 4921 patch_prober.go:28] interesting pod/router-default-5444994796-5ns6b container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 12 13:12:09 crc kubenswrapper[4921]: [-]has-synced failed: reason withheld Mar 12 13:12:09 crc kubenswrapper[4921]: [+]process-running ok Mar 12 13:12:09 crc 
kubenswrapper[4921]: healthz check failed Mar 12 13:12:09 crc kubenswrapper[4921]: I0312 13:12:09.151857 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5ns6b" podUID="0a2f1c9e-853d-4c03-b3ef-e56e61fe5e7c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 12 13:12:10 crc kubenswrapper[4921]: I0312 13:12:10.148873 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:12:10 crc kubenswrapper[4921]: I0312 13:12:10.151804 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5ns6b" Mar 12 13:12:10 crc kubenswrapper[4921]: I0312 13:12:10.182471 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-grsp5" Mar 12 13:12:14 crc kubenswrapper[4921]: I0312 13:12:14.819850 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:12:17 crc kubenswrapper[4921]: I0312 13:12:17.165968 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:12:17 crc kubenswrapper[4921]: I0312 13:12:17.170958 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:12:17 crc kubenswrapper[4921]: E0312 13:12:17.536647 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:12:17 crc kubenswrapper[4921]: E0312 13:12:17.539514 4921 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:12:17 crc kubenswrapper[4921]: E0312 13:12:17.541655 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 13:12:17 crc kubenswrapper[4921]: E0312 13:12:17.541748 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" Mar 12 13:12:18 crc kubenswrapper[4921]: E0312 13:12:18.831511 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 13:12:18 crc kubenswrapper[4921]: E0312 13:12:18.831700 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xqxbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ll7bv_openshift-marketplace(d6868925-795c-4765-9343-0b147db98216): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 13:12:18 crc kubenswrapper[4921]: E0312 13:12:18.833322 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ll7bv" podUID="d6868925-795c-4765-9343-0b147db98216" Mar 12 13:12:19 crc 
kubenswrapper[4921]: I0312 13:12:19.004942 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.360793 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.363282 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.366808 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.367132 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.377640 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.417143 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.417123148 podStartE2EDuration="4.417123148s" podCreationTimestamp="2026-03-12 13:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:22.404454487 +0000 UTC m=+165.094526458" watchObservedRunningTime="2026-03-12 13:12:22.417123148 +0000 UTC m=+165.107195119" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.418374 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/484da162-312d-46b3-a31b-a5cdd420d742-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 
crc kubenswrapper[4921]: I0312 13:12:22.418444 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/484da162-312d-46b3-a31b-a5cdd420d742-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.519375 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/484da162-312d-46b3-a31b-a5cdd420d742-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.519459 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/484da162-312d-46b3-a31b-a5cdd420d742-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.519828 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/484da162-312d-46b3-a31b-a5cdd420d742-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.539639 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/484da162-312d-46b3-a31b-a5cdd420d742-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.617447 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ll7bv" podUID="d6868925-795c-4765-9343-0b147db98216" Mar 12 13:12:22 crc kubenswrapper[4921]: I0312 13:12:22.695378 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.711502 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.711769 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7f8tt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kv2xc_openshift-marketplace(ec0983c2-4cd5-41aa-972c-60dd47817a5b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.713059 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kv2xc" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" Mar 12 13:12:22 crc 
kubenswrapper[4921]: E0312 13:12:22.740731 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.741039 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dsbrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-7vlvg_openshift-marketplace(dc904419-43b3-4164-8efb-b493171791cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.742328 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7vlvg" podUID="dc904419-43b3-4164-8efb-b493171791cc" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.745431 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.745555 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5gmld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-xxd4x_openshift-marketplace(0840f674-6e13-4336-ad20-a67b979ae5ba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 12 13:12:22 crc kubenswrapper[4921]: E0312 13:12:22.746892 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-xxd4x" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" Mar 12 13:12:23 crc 
kubenswrapper[4921]: I0312 13:12:23.019043 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lgwhc_81ec102d-42ba-4d41-952d-d36fa110e626/kube-multus-additional-cni-plugins/0.log" Mar 12 13:12:23 crc kubenswrapper[4921]: I0312 13:12:23.019448 4921 generic.go:334] "Generic (PLEG): container finished" podID="81ec102d-42ba-4d41-952d-d36fa110e626" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" exitCode=137 Mar 12 13:12:23 crc kubenswrapper[4921]: I0312 13:12:23.019707 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" event={"ID":"81ec102d-42ba-4d41-952d-d36fa110e626","Type":"ContainerDied","Data":"f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813"} Mar 12 13:12:25 crc kubenswrapper[4921]: E0312 13:12:25.346511 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-xxd4x" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" Mar 12 13:12:25 crc kubenswrapper[4921]: E0312 13:12:25.346654 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7vlvg" podUID="dc904419-43b3-4164-8efb-b493171791cc" Mar 12 13:12:25 crc kubenswrapper[4921]: E0312 13:12:25.357458 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kv2xc" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" Mar 12 13:12:25 
crc kubenswrapper[4921]: I0312 13:12:25.810450 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lgwhc_81ec102d-42ba-4d41-952d-d36fa110e626/kube-multus-additional-cni-plugins/0.log" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.810518 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.872590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81ec102d-42ba-4d41-952d-d36fa110e626-tuning-conf-dir\") pod \"81ec102d-42ba-4d41-952d-d36fa110e626\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.872732 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/81ec102d-42ba-4d41-952d-d36fa110e626-ready\") pod \"81ec102d-42ba-4d41-952d-d36fa110e626\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.872827 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktws\" (UniqueName: \"kubernetes.io/projected/81ec102d-42ba-4d41-952d-d36fa110e626-kube-api-access-nktws\") pod \"81ec102d-42ba-4d41-952d-d36fa110e626\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.872883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/81ec102d-42ba-4d41-952d-d36fa110e626-cni-sysctl-allowlist\") pod \"81ec102d-42ba-4d41-952d-d36fa110e626\" (UID: \"81ec102d-42ba-4d41-952d-d36fa110e626\") " Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.872903 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/81ec102d-42ba-4d41-952d-d36fa110e626-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "81ec102d-42ba-4d41-952d-d36fa110e626" (UID: "81ec102d-42ba-4d41-952d-d36fa110e626"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.873186 4921 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/81ec102d-42ba-4d41-952d-d36fa110e626-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.873431 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ec102d-42ba-4d41-952d-d36fa110e626-ready" (OuterVolumeSpecName: "ready") pod "81ec102d-42ba-4d41-952d-d36fa110e626" (UID: "81ec102d-42ba-4d41-952d-d36fa110e626"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.873669 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81ec102d-42ba-4d41-952d-d36fa110e626-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "81ec102d-42ba-4d41-952d-d36fa110e626" (UID: "81ec102d-42ba-4d41-952d-d36fa110e626"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.880177 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ec102d-42ba-4d41-952d-d36fa110e626-kube-api-access-nktws" (OuterVolumeSpecName: "kube-api-access-nktws") pod "81ec102d-42ba-4d41-952d-d36fa110e626" (UID: "81ec102d-42ba-4d41-952d-d36fa110e626"). InnerVolumeSpecName "kube-api-access-nktws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.974792 4921 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/81ec102d-42ba-4d41-952d-d36fa110e626-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.974862 4921 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/81ec102d-42ba-4d41-952d-d36fa110e626-ready\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:25 crc kubenswrapper[4921]: I0312 13:12:25.974874 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktws\" (UniqueName: \"kubernetes.io/projected/81ec102d-42ba-4d41-952d-d36fa110e626-kube-api-access-nktws\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.052299 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-lgwhc_81ec102d-42ba-4d41-952d-d36fa110e626/kube-multus-additional-cni-plugins/0.log" Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.052364 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" event={"ID":"81ec102d-42ba-4d41-952d-d36fa110e626","Type":"ContainerDied","Data":"10f2850688d3a00bee279341c32453288841f7df15daa586ef137ac7d27713ea"} Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.052408 4921 scope.go:117] "RemoveContainer" containerID="f5555b0ef0e3d30a1c2751b3c85c0f7c91680c00716833d8ce887a67c2b15813" Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.052442 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-lgwhc" Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.074725 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lgwhc"] Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.078855 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-lgwhc"] Mar 12 13:12:26 crc kubenswrapper[4921]: I0312 13:12:26.283052 4921 ???:1] "http: TLS handshake error from 192.168.126.11:34898: no serving certificate available for the kubelet" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.062947 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerStarted","Data":"b214c191ba23cf298a0840b1884d17db2c3fd3731dcd017fe97b6fc0f9cbb977"} Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.066699 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerStarted","Data":"b3098ec18ce1e8a07b54a5064ee66b9aa4b52da2e9dd4e250506dea2df8590cd"} Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.068028 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerStarted","Data":"95ba84464b900e02fd0ae21b7b33879edd89ecb73645b67258f5728da6075215"} Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.070878 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerStarted","Data":"4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6"} Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.141953 4921 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 12 13:12:27 crc kubenswrapper[4921]: W0312 13:12:27.145298 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod484da162_312d_46b3_a31b_a5cdd420d742.slice/crio-0de8245c7b23a034031a686b3b5c3a1efca5236b6affe659c35bb122b462a8db WatchSource:0}: Error finding container 0de8245c7b23a034031a686b3b5c3a1efca5236b6affe659c35bb122b462a8db: Status 404 returned error can't find the container with id 0de8245c7b23a034031a686b3b5c3a1efca5236b6affe659c35bb122b462a8db Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.310909 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.351689 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 13:12:27 crc kubenswrapper[4921]: E0312 13:12:27.352593 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.352607 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.352717 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" containerName="kube-multus-additional-cni-plugins" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.353102 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.363753 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.496248 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14fec143-fc2c-4cef-93c5-0bcc947068a3-kube-api-access\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.496369 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-var-lock\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.496434 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.598057 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14fec143-fc2c-4cef-93c5-0bcc947068a3-kube-api-access\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.598575 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-var-lock\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.598681 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-var-lock\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.598843 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.598938 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-kubelet-dir\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.618449 4921 csr.go:261] certificate signing request csr-6l27t is approved, waiting to be issued Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.619266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14fec143-fc2c-4cef-93c5-0bcc947068a3-kube-api-access\") pod \"installer-9-crc\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 13:12:27.625577 4921 csr.go:257] certificate signing request csr-6l27t is issued Mar 12 13:12:27 crc kubenswrapper[4921]: I0312 
13:12:27.754576 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.001616 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ec102d-42ba-4d41-952d-d36fa110e626" path="/var/lib/kubelet/pods/81ec102d-42ba-4d41-952d-d36fa110e626/volumes" Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.080603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"484da162-312d-46b3-a31b-a5cdd420d742","Type":"ContainerStarted","Data":"ca6677bb46e8bc105291c243389a99b392b7cdd09c696cbae7cdb56fbd78212e"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.080769 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"484da162-312d-46b3-a31b-a5cdd420d742","Type":"ContainerStarted","Data":"0de8245c7b23a034031a686b3b5c3a1efca5236b6affe659c35bb122b462a8db"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.083419 4921 generic.go:334] "Generic (PLEG): container finished" podID="7fef7638-98df-405a-b04b-f47997b46eac" containerID="b3098ec18ce1e8a07b54a5064ee66b9aa4b52da2e9dd4e250506dea2df8590cd" exitCode=0 Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.083559 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerDied","Data":"b3098ec18ce1e8a07b54a5064ee66b9aa4b52da2e9dd4e250506dea2df8590cd"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.088399 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerID="4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6" exitCode=0 Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.088627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerDied","Data":"4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.094259 4921 generic.go:334] "Generic (PLEG): container finished" podID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerID="95ba84464b900e02fd0ae21b7b33879edd89ecb73645b67258f5728da6075215" exitCode=0 Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.094349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerDied","Data":"95ba84464b900e02fd0ae21b7b33879edd89ecb73645b67258f5728da6075215"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.104729 4921 generic.go:334] "Generic (PLEG): container finished" podID="fc6eb617-cfea-4abf-81fd-8417dc305d9c" containerID="e431e0e27e2398cdb9c5a15802593ff33ca2e82474e2c9fedf1b2d11a2daf186" exitCode=0 Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.105119 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" event={"ID":"fc6eb617-cfea-4abf-81fd-8417dc305d9c","Type":"ContainerDied","Data":"e431e0e27e2398cdb9c5a15802593ff33ca2e82474e2c9fedf1b2d11a2daf186"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.117932 4921 generic.go:334] "Generic (PLEG): container finished" podID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerID="b214c191ba23cf298a0840b1884d17db2c3fd3731dcd017fe97b6fc0f9cbb977" exitCode=0 Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.118007 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerDied","Data":"b214c191ba23cf298a0840b1884d17db2c3fd3731dcd017fe97b6fc0f9cbb977"} Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.130784 
4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=6.130760075 podStartE2EDuration="6.130760075s" podCreationTimestamp="2026-03-12 13:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:28.118587018 +0000 UTC m=+170.808659009" watchObservedRunningTime="2026-03-12 13:12:28.130760075 +0000 UTC m=+170.820832036" Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.222840 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 12 13:12:28 crc kubenswrapper[4921]: W0312 13:12:28.233343 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod14fec143_fc2c_4cef_93c5_0bcc947068a3.slice/crio-955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd WatchSource:0}: Error finding container 955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd: Status 404 returned error can't find the container with id 955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.628145 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-16 10:43:02.800922362 +0000 UTC Mar 12 13:12:28 crc kubenswrapper[4921]: I0312 13:12:28.628657 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7437h30m34.172269114s for next certificate rotation Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.006846 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.126145 4921 generic.go:334] "Generic (PLEG): container finished" podID="484da162-312d-46b3-a31b-a5cdd420d742" 
containerID="ca6677bb46e8bc105291c243389a99b392b7cdd09c696cbae7cdb56fbd78212e" exitCode=0 Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.126267 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"484da162-312d-46b3-a31b-a5cdd420d742","Type":"ContainerDied","Data":"ca6677bb46e8bc105291c243389a99b392b7cdd09c696cbae7cdb56fbd78212e"} Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.130882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14fec143-fc2c-4cef-93c5-0bcc947068a3","Type":"ContainerStarted","Data":"b70d308c9ece568252fb9953fd041234677ae9d05a84ae11c9cdb61916f02ee3"} Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.130941 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14fec143-fc2c-4cef-93c5-0bcc947068a3","Type":"ContainerStarted","Data":"955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd"} Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.151464 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.15143734 podStartE2EDuration="1.15143734s" podCreationTimestamp="2026-03-12 13:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:29.146709212 +0000 UTC m=+171.836781223" watchObservedRunningTime="2026-03-12 13:12:29.15143734 +0000 UTC m=+171.841509341" Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.190707 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.190689146 podStartE2EDuration="2.190689146s" podCreationTimestamp="2026-03-12 13:12:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:12:29.187432708 +0000 UTC m=+171.877504669" watchObservedRunningTime="2026-03-12 13:12:29.190689146 +0000 UTC m=+171.880761107" Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.515197 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.544142 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnnlg\" (UniqueName: \"kubernetes.io/projected/fc6eb617-cfea-4abf-81fd-8417dc305d9c-kube-api-access-nnnlg\") pod \"fc6eb617-cfea-4abf-81fd-8417dc305d9c\" (UID: \"fc6eb617-cfea-4abf-81fd-8417dc305d9c\") " Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.553522 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6eb617-cfea-4abf-81fd-8417dc305d9c-kube-api-access-nnnlg" (OuterVolumeSpecName: "kube-api-access-nnnlg") pod "fc6eb617-cfea-4abf-81fd-8417dc305d9c" (UID: "fc6eb617-cfea-4abf-81fd-8417dc305d9c"). InnerVolumeSpecName "kube-api-access-nnnlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.629009 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 00:11:43.009291547 +0000 UTC Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.629122 4921 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6370h59m13.380175008s for next certificate rotation Mar 12 13:12:29 crc kubenswrapper[4921]: I0312 13:12:29.645566 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnnlg\" (UniqueName: \"kubernetes.io/projected/fc6eb617-cfea-4abf-81fd-8417dc305d9c-kube-api-access-nnnlg\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.142022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" event={"ID":"fc6eb617-cfea-4abf-81fd-8417dc305d9c","Type":"ContainerDied","Data":"3b744f06e733cebedc9b6d7695d011670917a223a570617d82c4417ff78b41e3"} Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.142097 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555352-kdcsz" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.142124 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b744f06e733cebedc9b6d7695d011670917a223a570617d82c4417ff78b41e3" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.460363 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.556949 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/484da162-312d-46b3-a31b-a5cdd420d742-kubelet-dir\") pod \"484da162-312d-46b3-a31b-a5cdd420d742\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.557039 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/484da162-312d-46b3-a31b-a5cdd420d742-kube-api-access\") pod \"484da162-312d-46b3-a31b-a5cdd420d742\" (UID: \"484da162-312d-46b3-a31b-a5cdd420d742\") " Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.557232 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/484da162-312d-46b3-a31b-a5cdd420d742-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "484da162-312d-46b3-a31b-a5cdd420d742" (UID: "484da162-312d-46b3-a31b-a5cdd420d742"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.557395 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/484da162-312d-46b3-a31b-a5cdd420d742-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.570556 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484da162-312d-46b3-a31b-a5cdd420d742-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "484da162-312d-46b3-a31b-a5cdd420d742" (UID: "484da162-312d-46b3-a31b-a5cdd420d742"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:12:30 crc kubenswrapper[4921]: I0312 13:12:30.659109 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/484da162-312d-46b3-a31b-a5cdd420d742-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.155047 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.155081 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"484da162-312d-46b3-a31b-a5cdd420d742","Type":"ContainerDied","Data":"0de8245c7b23a034031a686b3b5c3a1efca5236b6affe659c35bb122b462a8db"} Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.155135 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de8245c7b23a034031a686b3b5c3a1efca5236b6affe659c35bb122b462a8db" Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.157675 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerStarted","Data":"87350e95bdd8b46b20adb55b84579df94eff35960e97f961929a1f169e337d9a"} Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.161780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerStarted","Data":"cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248"} Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.172056 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" 
event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerStarted","Data":"3febb8809e812f601cb90a1cdaf8dd74fde388aa3401509f9ee8a97dd8003ec0"} Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.196577 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m6c4k" podStartSLOduration=6.625250861 podStartE2EDuration="45.19653539s" podCreationTimestamp="2026-03-12 13:11:46 +0000 UTC" firstStartedPulling="2026-03-12 13:11:51.391846464 +0000 UTC m=+134.081918425" lastFinishedPulling="2026-03-12 13:12:29.963130943 +0000 UTC m=+172.653202954" observedRunningTime="2026-03-12 13:12:31.178788973 +0000 UTC m=+173.868860944" watchObservedRunningTime="2026-03-12 13:12:31.19653539 +0000 UTC m=+173.886607381" Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.200625 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tfjjf" podStartSLOduration=7.833767556 podStartE2EDuration="46.20061751s" podCreationTimestamp="2026-03-12 13:11:45 +0000 UTC" firstStartedPulling="2026-03-12 13:11:51.443357455 +0000 UTC m=+134.133429426" lastFinishedPulling="2026-03-12 13:12:29.810207369 +0000 UTC m=+172.500279380" observedRunningTime="2026-03-12 13:12:31.196804047 +0000 UTC m=+173.886876048" watchObservedRunningTime="2026-03-12 13:12:31.20061751 +0000 UTC m=+173.890689481" Mar 12 13:12:31 crc kubenswrapper[4921]: I0312 13:12:31.217911 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6tzw9" podStartSLOduration=6.227377058 podStartE2EDuration="46.217881825s" podCreationTimestamp="2026-03-12 13:11:45 +0000 UTC" firstStartedPulling="2026-03-12 13:11:50.173549831 +0000 UTC m=+132.863621802" lastFinishedPulling="2026-03-12 13:12:30.164054568 +0000 UTC m=+172.854126569" observedRunningTime="2026-03-12 13:12:31.21473777 +0000 UTC m=+173.904809751" watchObservedRunningTime="2026-03-12 
13:12:31.217881825 +0000 UTC m=+173.907953836" Mar 12 13:12:32 crc kubenswrapper[4921]: I0312 13:12:32.179785 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerStarted","Data":"099bc7c95faf2466e5fc9b2cb9bdd431336e4f297d4ccc1fbbfff42554b33019"} Mar 12 13:12:32 crc kubenswrapper[4921]: I0312 13:12:32.203056 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lvkz9" podStartSLOduration=9.624819375 podStartE2EDuration="49.203023463s" podCreationTimestamp="2026-03-12 13:11:43 +0000 UTC" firstStartedPulling="2026-03-12 13:11:51.544383493 +0000 UTC m=+134.234455464" lastFinishedPulling="2026-03-12 13:12:31.122587581 +0000 UTC m=+173.812659552" observedRunningTime="2026-03-12 13:12:32.198040589 +0000 UTC m=+174.888112580" watchObservedRunningTime="2026-03-12 13:12:32.203023463 +0000 UTC m=+174.893095434" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.556429 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.556500 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.587905 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.587976 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.639937 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 
13:12:37.640017 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.656223 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.656295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.734740 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.736737 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:12:37 crc kubenswrapper[4921]: I0312 13:12:37.739087 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:12:38 crc kubenswrapper[4921]: I0312 13:12:38.272059 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:12:38 crc kubenswrapper[4921]: I0312 13:12:38.274079 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:12:38 crc kubenswrapper[4921]: I0312 13:12:38.276034 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:12:38 crc kubenswrapper[4921]: I0312 13:12:38.720236 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m6c4k" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="registry-server" probeResult="failure" output=< Mar 12 13:12:38 crc kubenswrapper[4921]: timeout: failed to 
connect service ":50051" within 1s Mar 12 13:12:38 crc kubenswrapper[4921]: > Mar 12 13:12:39 crc kubenswrapper[4921]: I0312 13:12:39.230442 4921 generic.go:334] "Generic (PLEG): container finished" podID="d6868925-795c-4765-9343-0b147db98216" containerID="550bea0c575ebee0dad224b93817bc115249971c559027fe35e56d75bd1233aa" exitCode=0 Mar 12 13:12:39 crc kubenswrapper[4921]: I0312 13:12:39.230517 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll7bv" event={"ID":"d6868925-795c-4765-9343-0b147db98216","Type":"ContainerDied","Data":"550bea0c575ebee0dad224b93817bc115249971c559027fe35e56d75bd1233aa"} Mar 12 13:12:39 crc kubenswrapper[4921]: I0312 13:12:39.232857 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerStarted","Data":"0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87"} Mar 12 13:12:40 crc kubenswrapper[4921]: I0312 13:12:40.243917 4921 generic.go:334] "Generic (PLEG): container finished" podID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerID="0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87" exitCode=0 Mar 12 13:12:40 crc kubenswrapper[4921]: I0312 13:12:40.244431 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerDied","Data":"0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87"} Mar 12 13:12:40 crc kubenswrapper[4921]: I0312 13:12:40.699102 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfjjf"] Mar 12 13:12:40 crc kubenswrapper[4921]: I0312 13:12:40.699463 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tfjjf" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="registry-server" 
containerID="cri-o://cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248" gracePeriod=2 Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.105118 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.255789 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll7bv" event={"ID":"d6868925-795c-4765-9343-0b147db98216","Type":"ContainerStarted","Data":"6e406b90e54138979540bc2ad36218f2b8fc106654e5200e87fb642f8e0786e6"} Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.258372 4921 generic.go:334] "Generic (PLEG): container finished" podID="dc904419-43b3-4164-8efb-b493171791cc" containerID="d1fb2f0dcd47401856c892e8ec8f44677ec89df061c36fdce8ab427add3b0414" exitCode=0 Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.258451 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerDied","Data":"d1fb2f0dcd47401856c892e8ec8f44677ec89df061c36fdce8ab427add3b0414"} Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.262964 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerID="cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248" exitCode=0 Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.263009 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerDied","Data":"cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248"} Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.263042 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tfjjf" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.263060 4921 scope.go:117] "RemoveContainer" containerID="cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.263043 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tfjjf" event={"ID":"0a8433ae-09da-4dfb-98c6-922fcfbaa546","Type":"ContainerDied","Data":"1410175a8aafdda4091710a667d32ec77eb01332bfd1b617627fb5e5920ade5e"} Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.283039 4921 scope.go:117] "RemoveContainer" containerID="4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.287949 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ll7bv" podStartSLOduration=8.258224778 podStartE2EDuration="58.287929402s" podCreationTimestamp="2026-03-12 13:11:43 +0000 UTC" firstStartedPulling="2026-03-12 13:11:50.231416508 +0000 UTC m=+132.921488469" lastFinishedPulling="2026-03-12 13:12:40.261121122 +0000 UTC m=+182.951193093" observedRunningTime="2026-03-12 13:12:41.284912121 +0000 UTC m=+183.974984112" watchObservedRunningTime="2026-03-12 13:12:41.287929402 +0000 UTC m=+183.978001373" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.304168 4921 scope.go:117] "RemoveContainer" containerID="26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.311922 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dcb2\" (UniqueName: \"kubernetes.io/projected/0a8433ae-09da-4dfb-98c6-922fcfbaa546-kube-api-access-4dcb2\") pod \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.312032 
4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-catalog-content\") pod \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.312090 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-utilities\") pod \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\" (UID: \"0a8433ae-09da-4dfb-98c6-922fcfbaa546\") " Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.316493 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-utilities" (OuterVolumeSpecName: "utilities") pod "0a8433ae-09da-4dfb-98c6-922fcfbaa546" (UID: "0a8433ae-09da-4dfb-98c6-922fcfbaa546"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.321554 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a8433ae-09da-4dfb-98c6-922fcfbaa546-kube-api-access-4dcb2" (OuterVolumeSpecName: "kube-api-access-4dcb2") pod "0a8433ae-09da-4dfb-98c6-922fcfbaa546" (UID: "0a8433ae-09da-4dfb-98c6-922fcfbaa546"). InnerVolumeSpecName "kube-api-access-4dcb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.343459 4921 scope.go:117] "RemoveContainer" containerID="cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248" Mar 12 13:12:41 crc kubenswrapper[4921]: E0312 13:12:41.343908 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248\": container with ID starting with cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248 not found: ID does not exist" containerID="cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.343951 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248"} err="failed to get container status \"cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248\": rpc error: code = NotFound desc = could not find container \"cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248\": container with ID starting with cf2a7d1d0c0544011872906f3ef8c406d497578d2c81663a08991dc43cf9d248 not found: ID does not exist" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.343982 4921 scope.go:117] "RemoveContainer" containerID="4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6" Mar 12 13:12:41 crc kubenswrapper[4921]: E0312 13:12:41.344498 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6\": container with ID starting with 4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6 not found: ID does not exist" containerID="4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.344534 
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6"} err="failed to get container status \"4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6\": rpc error: code = NotFound desc = could not find container \"4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6\": container with ID starting with 4c44ae8b67d035f13e8ac79eebe0d32a0ffb64a616f982e0d60c3a22f3515bb6 not found: ID does not exist" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.344574 4921 scope.go:117] "RemoveContainer" containerID="26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a" Mar 12 13:12:41 crc kubenswrapper[4921]: E0312 13:12:41.344887 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a\": container with ID starting with 26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a not found: ID does not exist" containerID="26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.344914 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a"} err="failed to get container status \"26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a\": rpc error: code = NotFound desc = could not find container \"26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a\": container with ID starting with 26db567317c4d838c1ba4430fd11155f11bcc177ad72e43713aae1ce4b31637a not found: ID does not exist" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.349944 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "0a8433ae-09da-4dfb-98c6-922fcfbaa546" (UID: "0a8433ae-09da-4dfb-98c6-922fcfbaa546"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.414159 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.414200 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dcb2\" (UniqueName: \"kubernetes.io/projected/0a8433ae-09da-4dfb-98c6-922fcfbaa546-kube-api-access-4dcb2\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.414217 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a8433ae-09da-4dfb-98c6-922fcfbaa546-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.618740 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfjjf"] Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.622627 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tfjjf"] Mar 12 13:12:41 crc kubenswrapper[4921]: I0312 13:12:41.991783 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" path="/var/lib/kubelet/pods/0a8433ae-09da-4dfb-98c6-922fcfbaa546/volumes" Mar 12 13:12:42 crc kubenswrapper[4921]: I0312 13:12:42.277479 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerStarted","Data":"ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f"} Mar 12 13:12:43 crc kubenswrapper[4921]: I0312 
13:12:43.290661 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerStarted","Data":"db68c1cea3e3dc00ddeae4350261332d29e67adfadb9250501e6ae376506065a"} Mar 12 13:12:43 crc kubenswrapper[4921]: I0312 13:12:43.309686 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xxd4x" podStartSLOduration=6.893077406 podStartE2EDuration="57.309621222s" podCreationTimestamp="2026-03-12 13:11:46 +0000 UTC" firstStartedPulling="2026-03-12 13:11:51.325117611 +0000 UTC m=+134.015189582" lastFinishedPulling="2026-03-12 13:12:41.741661427 +0000 UTC m=+184.431733398" observedRunningTime="2026-03-12 13:12:42.308389671 +0000 UTC m=+184.998461652" watchObservedRunningTime="2026-03-12 13:12:43.309621222 +0000 UTC m=+185.999693193" Mar 12 13:12:43 crc kubenswrapper[4921]: I0312 13:12:43.315304 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7vlvg" podStartSLOduration=8.210050284 podStartE2EDuration="1m0.315281724s" podCreationTimestamp="2026-03-12 13:11:43 +0000 UTC" firstStartedPulling="2026-03-12 13:11:50.215303178 +0000 UTC m=+132.905375149" lastFinishedPulling="2026-03-12 13:12:42.320534608 +0000 UTC m=+185.010606589" observedRunningTime="2026-03-12 13:12:43.309598002 +0000 UTC m=+185.999670013" watchObservedRunningTime="2026-03-12 13:12:43.315281724 +0000 UTC m=+186.005353695" Mar 12 13:12:44 crc kubenswrapper[4921]: I0312 13:12:44.298661 4921 generic.go:334] "Generic (PLEG): container finished" podID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerID="da664644d55c54ec03ffc95ad6a77e6707f840ffdf66c9bf57e1a5e64845477c" exitCode=0 Mar 12 13:12:44 crc kubenswrapper[4921]: I0312 13:12:44.298752 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv2xc" 
event={"ID":"ec0983c2-4cd5-41aa-972c-60dd47817a5b","Type":"ContainerDied","Data":"da664644d55c54ec03ffc95ad6a77e6707f840ffdf66c9bf57e1a5e64845477c"} Mar 12 13:12:45 crc kubenswrapper[4921]: I0312 13:12:45.308052 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv2xc" event={"ID":"ec0983c2-4cd5-41aa-972c-60dd47817a5b","Type":"ContainerStarted","Data":"fb984f406a5e52ccc6b6180231010e7945cb4e444dedad3c3223fddb1934c3e5"} Mar 12 13:12:45 crc kubenswrapper[4921]: I0312 13:12:45.330440 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kv2xc" podStartSLOduration=8.7541797 podStartE2EDuration="1m2.330421699s" podCreationTimestamp="2026-03-12 13:11:43 +0000 UTC" firstStartedPulling="2026-03-12 13:11:51.473276014 +0000 UTC m=+134.163347985" lastFinishedPulling="2026-03-12 13:12:45.049518013 +0000 UTC m=+187.739589984" observedRunningTime="2026-03-12 13:12:45.326047581 +0000 UTC m=+188.016119552" watchObservedRunningTime="2026-03-12 13:12:45.330421699 +0000 UTC m=+188.020493670" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.550775 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.551869 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.624161 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.643704 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.643764 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.652932 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.652980 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.659731 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.715392 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.719623 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.739255 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.740455 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:12:47 crc kubenswrapper[4921]: I0312 13:12:47.785459 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:12:48 crc kubenswrapper[4921]: I0312 13:12:48.381996 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:12:48 crc kubenswrapper[4921]: I0312 13:12:48.398711 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:12:48 crc kubenswrapper[4921]: I0312 
13:12:48.729651 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xxd4x" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="registry-server" probeResult="failure" output=< Mar 12 13:12:48 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 13:12:48 crc kubenswrapper[4921]: > Mar 12 13:12:50 crc kubenswrapper[4921]: I0312 13:12:50.099229 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7vlvg"] Mar 12 13:12:50 crc kubenswrapper[4921]: I0312 13:12:50.343585 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7vlvg" podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="registry-server" containerID="cri-o://db68c1cea3e3dc00ddeae4350261332d29e67adfadb9250501e6ae376506065a" gracePeriod=2 Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.354220 4921 generic.go:334] "Generic (PLEG): container finished" podID="dc904419-43b3-4164-8efb-b493171791cc" containerID="db68c1cea3e3dc00ddeae4350261332d29e67adfadb9250501e6ae376506065a" exitCode=0 Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.354296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerDied","Data":"db68c1cea3e3dc00ddeae4350261332d29e67adfadb9250501e6ae376506065a"} Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.355170 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7vlvg" event={"ID":"dc904419-43b3-4164-8efb-b493171791cc","Type":"ContainerDied","Data":"737b7a0e9d1bf6c8f9f36aeeee1f7067b2978ff3ff389e9a6d17e7985d911a25"} Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.355194 4921 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="737b7a0e9d1bf6c8f9f36aeeee1f7067b2978ff3ff389e9a6d17e7985d911a25" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.373273 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.564253 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsbrp\" (UniqueName: \"kubernetes.io/projected/dc904419-43b3-4164-8efb-b493171791cc-kube-api-access-dsbrp\") pod \"dc904419-43b3-4164-8efb-b493171791cc\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.564329 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-catalog-content\") pod \"dc904419-43b3-4164-8efb-b493171791cc\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.564442 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-utilities\") pod \"dc904419-43b3-4164-8efb-b493171791cc\" (UID: \"dc904419-43b3-4164-8efb-b493171791cc\") " Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.565330 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-utilities" (OuterVolumeSpecName: "utilities") pod "dc904419-43b3-4164-8efb-b493171791cc" (UID: "dc904419-43b3-4164-8efb-b493171791cc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.572549 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc904419-43b3-4164-8efb-b493171791cc-kube-api-access-dsbrp" (OuterVolumeSpecName: "kube-api-access-dsbrp") pod "dc904419-43b3-4164-8efb-b493171791cc" (UID: "dc904419-43b3-4164-8efb-b493171791cc"). InnerVolumeSpecName "kube-api-access-dsbrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.622708 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc904419-43b3-4164-8efb-b493171791cc" (UID: "dc904419-43b3-4164-8efb-b493171791cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.666050 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsbrp\" (UniqueName: \"kubernetes.io/projected/dc904419-43b3-4164-8efb-b493171791cc-kube-api-access-dsbrp\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.666096 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:51 crc kubenswrapper[4921]: I0312 13:12:51.666111 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc904419-43b3-4164-8efb-b493171791cc-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:12:52 crc kubenswrapper[4921]: I0312 13:12:52.360436 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7vlvg" Mar 12 13:12:52 crc kubenswrapper[4921]: I0312 13:12:52.382375 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7vlvg"] Mar 12 13:12:52 crc kubenswrapper[4921]: I0312 13:12:52.382420 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7vlvg"] Mar 12 13:12:53 crc kubenswrapper[4921]: I0312 13:12:53.992997 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc904419-43b3-4164-8efb-b493171791cc" path="/var/lib/kubelet/pods/dc904419-43b3-4164-8efb-b493171791cc/volumes" Mar 12 13:12:57 crc kubenswrapper[4921]: I0312 13:12:57.683042 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:12:57 crc kubenswrapper[4921]: I0312 13:12:57.738193 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:12:57 crc kubenswrapper[4921]: I0312 13:12:57.782883 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:12:57 crc kubenswrapper[4921]: I0312 13:12:57.913651 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxd4x"] Mar 12 13:12:59 crc kubenswrapper[4921]: I0312 13:12:59.396808 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xxd4x" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="registry-server" containerID="cri-o://ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f" gracePeriod=2 Mar 12 13:12:59 crc kubenswrapper[4921]: I0312 13:12:59.886047 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.012120 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gmld\" (UniqueName: \"kubernetes.io/projected/0840f674-6e13-4336-ad20-a67b979ae5ba-kube-api-access-5gmld\") pod \"0840f674-6e13-4336-ad20-a67b979ae5ba\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.012172 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-catalog-content\") pod \"0840f674-6e13-4336-ad20-a67b979ae5ba\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.012208 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-utilities\") pod \"0840f674-6e13-4336-ad20-a67b979ae5ba\" (UID: \"0840f674-6e13-4336-ad20-a67b979ae5ba\") " Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.013561 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-utilities" (OuterVolumeSpecName: "utilities") pod "0840f674-6e13-4336-ad20-a67b979ae5ba" (UID: "0840f674-6e13-4336-ad20-a67b979ae5ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.020374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0840f674-6e13-4336-ad20-a67b979ae5ba-kube-api-access-5gmld" (OuterVolumeSpecName: "kube-api-access-5gmld") pod "0840f674-6e13-4336-ad20-a67b979ae5ba" (UID: "0840f674-6e13-4336-ad20-a67b979ae5ba"). InnerVolumeSpecName "kube-api-access-5gmld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.117406 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gmld\" (UniqueName: \"kubernetes.io/projected/0840f674-6e13-4336-ad20-a67b979ae5ba-kube-api-access-5gmld\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.117455 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.125723 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kv2xc"] Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.126054 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kv2xc" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="registry-server" containerID="cri-o://fb984f406a5e52ccc6b6180231010e7945cb4e444dedad3c3223fddb1934c3e5" gracePeriod=2 Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.159330 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0840f674-6e13-4336-ad20-a67b979ae5ba" (UID: "0840f674-6e13-4336-ad20-a67b979ae5ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.218757 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0840f674-6e13-4336-ad20-a67b979ae5ba-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.404567 4921 generic.go:334] "Generic (PLEG): container finished" podID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerID="ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f" exitCode=0 Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.404678 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xxd4x" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.404698 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerDied","Data":"ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f"} Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.404760 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xxd4x" event={"ID":"0840f674-6e13-4336-ad20-a67b979ae5ba","Type":"ContainerDied","Data":"e31317af4b6c9d980d16ea822c5a98a1a2f2559c32cca024a2f244625fd26be0"} Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.404791 4921 scope.go:117] "RemoveContainer" containerID="ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.409111 4921 generic.go:334] "Generic (PLEG): container finished" podID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerID="fb984f406a5e52ccc6b6180231010e7945cb4e444dedad3c3223fddb1934c3e5" exitCode=0 Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.409177 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kv2xc" event={"ID":"ec0983c2-4cd5-41aa-972c-60dd47817a5b","Type":"ContainerDied","Data":"fb984f406a5e52ccc6b6180231010e7945cb4e444dedad3c3223fddb1934c3e5"} Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.422683 4921 scope.go:117] "RemoveContainer" containerID="0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.424928 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.437500 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xxd4x"] Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.441451 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xxd4x"] Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.459024 4921 scope.go:117] "RemoveContainer" containerID="90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.474121 4921 scope.go:117] "RemoveContainer" containerID="ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f" Mar 12 13:13:00 crc kubenswrapper[4921]: E0312 13:13:00.474516 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f\": container with ID starting with ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f not found: ID does not exist" containerID="ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.474561 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f"} err="failed to get 
container status \"ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f\": rpc error: code = NotFound desc = could not find container \"ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f\": container with ID starting with ca5526df56c7b00ac88b2d274d1930af1a4889293b278ca5fd7b8ae92aa8839f not found: ID does not exist" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.474594 4921 scope.go:117] "RemoveContainer" containerID="0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87" Mar 12 13:13:00 crc kubenswrapper[4921]: E0312 13:13:00.475094 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87\": container with ID starting with 0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87 not found: ID does not exist" containerID="0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.475143 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87"} err="failed to get container status \"0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87\": rpc error: code = NotFound desc = could not find container \"0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87\": container with ID starting with 0caf51f05aaff9f1d164d56de25eaebfefe52095f88b13821b49609248057d87 not found: ID does not exist" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.475178 4921 scope.go:117] "RemoveContainer" containerID="90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47" Mar 12 13:13:00 crc kubenswrapper[4921]: E0312 13:13:00.475480 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47\": container with ID starting with 90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47 not found: ID does not exist" containerID="90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.475507 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47"} err="failed to get container status \"90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47\": rpc error: code = NotFound desc = could not find container \"90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47\": container with ID starting with 90d3a5563b82d98d5b3f5b84e37a859b1f7c1a1f1ace331931ada64ed8f3ab47 not found: ID does not exist" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.523928 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-utilities\") pod \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.524010 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8tt\" (UniqueName: \"kubernetes.io/projected/ec0983c2-4cd5-41aa-972c-60dd47817a5b-kube-api-access-7f8tt\") pod \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.524088 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-catalog-content\") pod \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\" (UID: \"ec0983c2-4cd5-41aa-972c-60dd47817a5b\") " Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.524751 
4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-utilities" (OuterVolumeSpecName: "utilities") pod "ec0983c2-4cd5-41aa-972c-60dd47817a5b" (UID: "ec0983c2-4cd5-41aa-972c-60dd47817a5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.531661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec0983c2-4cd5-41aa-972c-60dd47817a5b-kube-api-access-7f8tt" (OuterVolumeSpecName: "kube-api-access-7f8tt") pod "ec0983c2-4cd5-41aa-972c-60dd47817a5b" (UID: "ec0983c2-4cd5-41aa-972c-60dd47817a5b"). InnerVolumeSpecName "kube-api-access-7f8tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.578037 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec0983c2-4cd5-41aa-972c-60dd47817a5b" (UID: "ec0983c2-4cd5-41aa-972c-60dd47817a5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.626154 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.626188 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec0983c2-4cd5-41aa-972c-60dd47817a5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:00 crc kubenswrapper[4921]: I0312 13:13:00.626199 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f8tt\" (UniqueName: \"kubernetes.io/projected/ec0983c2-4cd5-41aa-972c-60dd47817a5b-kube-api-access-7f8tt\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.421353 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kv2xc" event={"ID":"ec0983c2-4cd5-41aa-972c-60dd47817a5b","Type":"ContainerDied","Data":"0f6f61564077b52afe0655e33f51df2254868c50feeb4b55cd7844c02ce9a90a"} Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.421413 4921 scope.go:117] "RemoveContainer" containerID="fb984f406a5e52ccc6b6180231010e7945cb4e444dedad3c3223fddb1934c3e5" Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.421413 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kv2xc" Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.434598 4921 scope.go:117] "RemoveContainer" containerID="da664644d55c54ec03ffc95ad6a77e6707f840ffdf66c9bf57e1a5e64845477c" Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.449195 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kv2xc"] Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.449339 4921 scope.go:117] "RemoveContainer" containerID="82b9d14f50937c345f0b422d4cfa5bb524775f1e32d61224f90b085dd36101eb" Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.459166 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kv2xc"] Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.994257 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" path="/var/lib/kubelet/pods/0840f674-6e13-4336-ad20-a67b979ae5ba/volumes" Mar 12 13:13:01 crc kubenswrapper[4921]: I0312 13:13:01.995104 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" path="/var/lib/kubelet/pods/ec0983c2-4cd5-41aa-972c-60dd47817a5b/volumes" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.553317 4921 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.553916 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.553932 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.553942 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.553949 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.553964 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.553970 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.553981 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.553987 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.553996 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554003 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554012 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554018 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554026 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554034 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554044 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484da162-312d-46b3-a31b-a5cdd420d742" containerName="pruner" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554050 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="484da162-312d-46b3-a31b-a5cdd420d742" containerName="pruner" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554057 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554063 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554075 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554080 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554088 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554095 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554103 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554109 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="extract-utilities" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554117 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554124 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="extract-content" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.554133 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6eb617-cfea-4abf-81fd-8417dc305d9c" containerName="oc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554139 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6eb617-cfea-4abf-81fd-8417dc305d9c" containerName="oc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554233 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a8433ae-09da-4dfb-98c6-922fcfbaa546" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554243 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec0983c2-4cd5-41aa-972c-60dd47817a5b" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554251 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6eb617-cfea-4abf-81fd-8417dc305d9c" containerName="oc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554258 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="484da162-312d-46b3-a31b-a5cdd420d742" containerName="pruner" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554267 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dc904419-43b3-4164-8efb-b493171791cc" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554280 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0840f674-6e13-4336-ad20-a67b979ae5ba" containerName="registry-server" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.554673 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556103 4921 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556241 4921 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556374 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556390 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556401 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556409 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556419 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556426 4921 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556434 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556440 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556447 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556453 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556461 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556467 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556478 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556484 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556492 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556498 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556506 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556512 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.556520 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556525 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556537 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636" gracePeriod=15 Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556624 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556633 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556644 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc 
kubenswrapper[4921]: I0312 13:13:06.556651 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556659 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556635 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8" gracePeriod=15 Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556656 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6" gracePeriod=15 Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556666 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556779 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.556635 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60" gracePeriod=15 Mar 12 13:13:06 
crc kubenswrapper[4921]: I0312 13:13:06.556731 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9" gracePeriod=15 Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.557014 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.557029 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.589109 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.714419 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.714909 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.714942 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.714973 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.715015 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.715034 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.715050 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.715066 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816780 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816846 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816854 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816865 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816924 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.816949 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817068 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817086 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817198 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817221 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817235 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817283 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817336 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.817402 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: I0312 13:13:06.885694 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:06 crc kubenswrapper[4921]: E0312 13:13:06.916056 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c1a395a927124 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:13:06.912137508 +0000 UTC m=+209.602209509,LastTimestamp:2026-03-12 13:13:06.912137508 +0000 UTC m=+209.602209509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.456511 4921 
generic.go:334] "Generic (PLEG): container finished" podID="14fec143-fc2c-4cef-93c5-0bcc947068a3" containerID="b70d308c9ece568252fb9953fd041234677ae9d05a84ae11c9cdb61916f02ee3" exitCode=0 Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.456602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14fec143-fc2c-4cef-93c5-0bcc947068a3","Type":"ContainerDied","Data":"b70d308c9ece568252fb9953fd041234677ae9d05a84ae11c9cdb61916f02ee3"} Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.457557 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.458033 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.459032 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.459629 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.461002 4921 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.461585 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8" exitCode=0 Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.461605 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6" exitCode=0 Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.461613 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60" exitCode=0 Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.461621 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9" exitCode=2 Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.461670 4921 scope.go:117] "RemoveContainer" containerID="35202d539a243cb28c79808a706d0f7030ad1b011ea706c3bf1132d623651ff6" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.463301 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548"} Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.463322 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3f388d454a31020ebd30cf28509c2c0dd00161c00b7104947dd34e5db6edc6f3"} Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.464178 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.464471 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.464885 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.538248 4921 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.538308 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial 
tcp 192.168.126.11:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: E0312 13:13:07.826518 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: E0312 13:13:07.826933 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: E0312 13:13:07.827122 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: E0312 13:13:07.827541 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: E0312 13:13:07.828348 4921 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.828402 4921 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 12 13:13:07 crc kubenswrapper[4921]: E0312 13:13:07.828773 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 
38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.986229 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.986725 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:07 crc kubenswrapper[4921]: I0312 13:13:07.986968 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:08 crc kubenswrapper[4921]: E0312 13:13:08.030197 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 12 13:13:08 crc kubenswrapper[4921]: E0312 13:13:08.430876 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.472665 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.757971 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.758640 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.758999 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.850168 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-kubelet-dir\") pod \"14fec143-fc2c-4cef-93c5-0bcc947068a3\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.850480 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-var-lock\") pod \"14fec143-fc2c-4cef-93c5-0bcc947068a3\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.850576 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14fec143-fc2c-4cef-93c5-0bcc947068a3-kube-api-access\") pod \"14fec143-fc2c-4cef-93c5-0bcc947068a3\" (UID: \"14fec143-fc2c-4cef-93c5-0bcc947068a3\") " Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.851654 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "14fec143-fc2c-4cef-93c5-0bcc947068a3" (UID: "14fec143-fc2c-4cef-93c5-0bcc947068a3"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.851703 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-var-lock" (OuterVolumeSpecName: "var-lock") pod "14fec143-fc2c-4cef-93c5-0bcc947068a3" (UID: "14fec143-fc2c-4cef-93c5-0bcc947068a3"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.855260 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14fec143-fc2c-4cef-93c5-0bcc947068a3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "14fec143-fc2c-4cef-93c5-0bcc947068a3" (UID: "14fec143-fc2c-4cef-93c5-0bcc947068a3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.952279 4921 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.952339 4921 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/14fec143-fc2c-4cef-93c5-0bcc947068a3-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:08 crc kubenswrapper[4921]: I0312 13:13:08.952350 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14fec143-fc2c-4cef-93c5-0bcc947068a3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.029110 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.030469 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.031324 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.032151 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.032993 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.154923 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.155022 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.155085 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.155189 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.155290 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.155349 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.156069 4921 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.156109 4921 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.156129 4921 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.172084 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:13:09Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:13:09Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:13:09Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T13:13:09Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:0d4c830b2653f2eeffebd09537afb06afb5ae827adbc03f224ab7269f399c05c\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d6065909bc521a3f9a85174276fdbceafad02a276449a7
dd1952a1f689b0d362\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1735807445},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:185237e125a9d710a58d4b588ea6b75eb361e4e99d979c1acd193de3b5d787f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:746054bb64fa0b27b1a696cd5db508bb9ee883a94969e4c1c4b5d35a93da8ef5\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1281521882},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0e6908b5c2800b56584a3fdf3bc164b76cb945966a49103123dabb61f8e367f2\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:ad31505e97766fe3b9d49abfe33098361de32a828c13e290be714f02a7ee76e0\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221788890},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-opera
tor-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"
sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394
b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":4
87097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.172918 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.173614 4921 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.174092 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.174503 4921 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.174537 4921 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.232771 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.483140 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"14fec143-fc2c-4cef-93c5-0bcc947068a3","Type":"ContainerDied","Data":"955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd"} Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.483223 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.483360 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.493282 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.494890 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636" exitCode=0 Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.494975 4921 scope.go:117] "RemoveContainer" containerID="afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.495092 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.518994 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.519628 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.520319 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.521009 4921 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.521533 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.522086 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.526888 4921 scope.go:117] "RemoveContainer" containerID="06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.549754 4921 scope.go:117] "RemoveContainer" containerID="420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.580662 4921 scope.go:117] "RemoveContainer" containerID="bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.604755 4921 scope.go:117] 
"RemoveContainer" containerID="72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.631496 4921 scope.go:117] "RemoveContainer" containerID="afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.645295 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b700691b7f77c4c918ad1e1deab7de223ed0acc92bf7d352cc231df31dcce01c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod14fec143_fc2c_4cef_93c5_0bcc947068a3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod14fec143_fc2c_4cef_93c5_0bcc947068a3.slice/crio-955fa95da25582dc391489a8451ee184f86743a454a27f00f64b934707b02edd\": RecentStats: unable to find data in memory cache]" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.665034 4921 scope.go:117] "RemoveContainer" containerID="afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.665941 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8\": container with ID starting with afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8 not found: ID does not exist" containerID="afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.665971 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8"} err="failed to get container status \"afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8\": rpc error: code = NotFound desc = could not find container \"afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8\": container with ID starting with afa2ffb558d1975cd44b098a477b250aeb500db914f05edfe28794ceb4218bb8 not found: ID does not exist" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.665994 4921 scope.go:117] "RemoveContainer" containerID="06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.667405 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6\": container with ID starting with 06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6 not found: ID does not exist" containerID="06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.667449 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6"} err="failed to get container status \"06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6\": rpc error: code = NotFound desc = could not find container \"06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6\": container with ID starting with 06a9f9dc3ddcc223d93884c566c8a5eb6afc46157cd8bb98148d461b90f859c6 not found: ID does not exist" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.667480 4921 scope.go:117] "RemoveContainer" containerID="420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.668193 4921 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60\": container with ID starting with 420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60 not found: ID does not exist" containerID="420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.668223 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60"} err="failed to get container status \"420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60\": rpc error: code = NotFound desc = could not find container \"420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60\": container with ID starting with 420a0b9ad4ac14e41f95cb652a63f4511903be6dd56ae8b3158029d208e2af60 not found: ID does not exist" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.668241 4921 scope.go:117] "RemoveContainer" containerID="bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.668558 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9\": container with ID starting with bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9 not found: ID does not exist" containerID="bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.668577 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9"} err="failed to get container status \"bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9\": rpc error: code = NotFound desc = could not find container 
\"bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9\": container with ID starting with bd41c2aa1fad49385a95b1988c6aabd969696c85268947327ec1e6149cac6aa9 not found: ID does not exist" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.668590 4921 scope.go:117] "RemoveContainer" containerID="72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.668896 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636\": container with ID starting with 72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636 not found: ID does not exist" containerID="72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.668935 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636"} err="failed to get container status \"72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636\": rpc error: code = NotFound desc = could not find container \"72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636\": container with ID starting with 72d5a41ba5e6693fddc4ac804a3ac70e84fba4c345d616f9c7ad0edf9cd18636 not found: ID does not exist" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.668961 4921 scope.go:117] "RemoveContainer" containerID="afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca" Mar 12 13:13:09 crc kubenswrapper[4921]: E0312 13:13:09.669251 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca\": container with ID starting with afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca not found: ID does not exist" 
containerID="afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.669286 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca"} err="failed to get container status \"afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca\": rpc error: code = NotFound desc = could not find container \"afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca\": container with ID starting with afb697f6d82069298998c403418b1110436b90fdf525301862cdb1bdb49ceeca not found: ID does not exist" Mar 12 13:13:09 crc kubenswrapper[4921]: I0312 13:13:09.995804 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 12 13:13:10 crc kubenswrapper[4921]: E0312 13:13:10.470855 4921 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c1a395a927124 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-12 13:13:06.912137508 +0000 UTC m=+209.602209509,LastTimestamp:2026-03-12 13:13:06.912137508 +0000 UTC 
m=+209.602209509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 12 13:13:10 crc kubenswrapper[4921]: E0312 13:13:10.833993 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 12 13:13:14 crc kubenswrapper[4921]: E0312 13:13:14.035481 4921 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="6.4s" Mar 12 13:13:16 crc kubenswrapper[4921]: I0312 13:13:16.983483 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:16 crc kubenswrapper[4921]: I0312 13:13:16.985859 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:16 crc kubenswrapper[4921]: I0312 13:13:16.986402 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.013040 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.013098 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:17 crc kubenswrapper[4921]: E0312 13:13:17.013640 4921 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.014460 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.564767 4921 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="192a134a0055114ea4b0d94ed7ce92594a32e2a6e3f3ff1348394a6446edbd33" exitCode=0 Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.564915 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"192a134a0055114ea4b0d94ed7ce92594a32e2a6e3f3ff1348394a6446edbd33"} Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.565544 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e17c62c0c84479a0917b5a96d5faae53ff9406a159da6d1ac69d8b8bf6828d4"} Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.566180 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.566225 4921 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:17 crc kubenswrapper[4921]: E0312 13:13:17.566984 4921 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.568123 4921 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:17 crc kubenswrapper[4921]: I0312 13:13:17.568505 4921 status_manager.go:851] "Failed to get status for pod" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 12 13:13:18 crc kubenswrapper[4921]: I0312 13:13:18.572354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4914fa0561ce78a9ce94afba018aaab585b48b64616fd2f48a5045406540524b"} Mar 12 13:13:18 crc kubenswrapper[4921]: I0312 13:13:18.572719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f09b0b2ebeb1857484fcd9ab86438ba4f9abf82419aaa860be8592f693c066e"} Mar 12 13:13:18 crc kubenswrapper[4921]: I0312 13:13:18.572732 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cb96bba3a4890c4028ab8f7ad2ca97eeb341b514579a2acda49f6e98a8a65d2a"} Mar 12 13:13:19 crc kubenswrapper[4921]: I0312 13:13:19.584636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"83354850dd13947ca9f2473e48ee7b9f2dee5b0354c8d160c4835ecd5769bfbf"} Mar 12 13:13:19 crc kubenswrapper[4921]: I0312 13:13:19.584692 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fa73733e7f193b0257e6f9bd77daf3b8b226bf15c7cafe765130b1c6bd7cd56c"} Mar 12 13:13:19 crc kubenswrapper[4921]: I0312 13:13:19.586012 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:19 crc kubenswrapper[4921]: I0312 13:13:19.586382 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:19 crc kubenswrapper[4921]: I0312 13:13:19.586550 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:22 crc kubenswrapper[4921]: I0312 13:13:22.015365 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:22 crc kubenswrapper[4921]: I0312 13:13:22.015677 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:22 crc kubenswrapper[4921]: I0312 13:13:22.022024 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:24 
crc kubenswrapper[4921]: I0312 13:13:24.606643 4921 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:24 crc kubenswrapper[4921]: I0312 13:13:24.614807 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 13:13:24 crc kubenswrapper[4921]: I0312 13:13:24.615947 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 13:13:24 crc kubenswrapper[4921]: I0312 13:13:24.615986 4921 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="40fd6609bda9c83a226d0c5c926067a50655b35b1b92b8cf4eaf211900ae707c" exitCode=1 Mar 12 13:13:24 crc kubenswrapper[4921]: I0312 13:13:24.616017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"40fd6609bda9c83a226d0c5c926067a50655b35b1b92b8cf4eaf211900ae707c"} Mar 12 13:13:24 crc kubenswrapper[4921]: I0312 13:13:24.616444 4921 scope.go:117] "RemoveContainer" containerID="40fd6609bda9c83a226d0c5c926067a50655b35b1b92b8cf4eaf211900ae707c" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.636342 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.638054 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 
13:13:25.638153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"03249638a2a68949be6e38e645463bac452a9dae61c018d7d40cd2c0c6d5fd04"} Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.640444 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.642643 4921 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="83354850dd13947ca9f2473e48ee7b9f2dee5b0354c8d160c4835ecd5769bfbf" exitCode=255 Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.642680 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"83354850dd13947ca9f2473e48ee7b9f2dee5b0354c8d160c4835ecd5769bfbf"} Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.643099 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.643131 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.649883 4921 scope.go:117] "RemoveContainer" containerID="83354850dd13947ca9f2473e48ee7b9f2dee5b0354c8d160c4835ecd5769bfbf" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.650537 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:25 crc kubenswrapper[4921]: I0312 13:13:25.670890 4921 status_manager.go:861] "Pod was deleted and then 
recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="06337c89-5bce-4dd1-9cd4-49b08882b6a8" Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.323692 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.323752 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.653744 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_71bb4a3aecc4ba5b26c4b7318770ce13/kube-apiserver-check-endpoints/0.log" Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.656620 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3e59e77ecb10898b64557a0cb737d1adf4363b178f1b36c3db90d215309fff8b"} Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.656823 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.656934 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:26 crc kubenswrapper[4921]: I0312 13:13:26.656959 4921 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:27 crc kubenswrapper[4921]: I0312 13:13:27.672476 4921 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:27 crc kubenswrapper[4921]: I0312 13:13:27.672526 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="57ea06df-a88e-4331-a4d5-ae9d7801b73c" Mar 12 13:13:27 crc kubenswrapper[4921]: I0312 13:13:27.911057 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:13:28 crc kubenswrapper[4921]: I0312 13:13:28.016053 4921 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="06337c89-5bce-4dd1-9cd4-49b08882b6a8" Mar 12 13:13:30 crc kubenswrapper[4921]: I0312 13:13:30.738520 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 13:13:30 crc kubenswrapper[4921]: I0312 13:13:30.789303 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 13:13:30 crc kubenswrapper[4921]: I0312 13:13:30.907273 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 13:13:30 crc kubenswrapper[4921]: I0312 13:13:30.951530 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.066334 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 
13:13:31.097896 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.212677 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.393435 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.397345 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.412572 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.426990 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.547283 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.567441 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.574592 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.685654 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.693953 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.763291 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.844883 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 13:13:31 crc kubenswrapper[4921]: I0312 13:13:31.877862 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.003515 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.057247 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.093980 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.103192 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.223011 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.261185 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.828481 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.846618 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 13:13:32 crc kubenswrapper[4921]: I0312 13:13:32.974535 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.009012 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.089598 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.299232 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.483994 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.632111 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.724638 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.776437 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.809647 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.863126 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.965865 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.979350 4921 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 13:13:33 crc kubenswrapper[4921]: I0312 13:13:33.999555 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.070651 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.135249 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.173590 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.180840 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.346445 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.438713 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.570864 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 
13:13:34.787385 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 12 13:13:34 crc kubenswrapper[4921]: I0312 13:13:34.892328 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.177993 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.293619 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.651326 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.684515 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.771573 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.836167 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.879328 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 12 13:13:35 crc kubenswrapper[4921]: I0312 13:13:35.881535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.077755 4921 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.162382 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.242966 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.356338 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.676521 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.677424 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.797889 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.803264 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 13:13:36 crc kubenswrapper[4921]: I0312 13:13:36.967157 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 13:13:37 crc kubenswrapper[4921]: I0312 13:13:37.002961 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 13:13:37 crc kubenswrapper[4921]: I0312 13:13:37.422692 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 13:13:37 crc kubenswrapper[4921]: 
I0312 13:13:37.904109 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 13:13:37 crc kubenswrapper[4921]: I0312 13:13:37.904434 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 13:13:37 crc kubenswrapper[4921]: I0312 13:13:37.917061 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 12 13:13:38 crc kubenswrapper[4921]: I0312 13:13:38.010742 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 13:13:38 crc kubenswrapper[4921]: I0312 13:13:38.226479 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 12 13:13:38 crc kubenswrapper[4921]: I0312 13:13:38.281797 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 13:13:38 crc kubenswrapper[4921]: I0312 13:13:38.456607 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 12 13:13:38 crc kubenswrapper[4921]: I0312 13:13:38.691307 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 12 13:13:38 crc kubenswrapper[4921]: I0312 13:13:38.712436 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.179213 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.537214 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.634488 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.710783 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.814965 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.960048 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 13:13:39 crc kubenswrapper[4921]: I0312 13:13:39.967862 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.148767 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.151399 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.220884 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.426868 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.562082 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.588107 4921 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.691253 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.698337 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.902968 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 12 13:13:40 crc kubenswrapper[4921]: I0312 13:13:40.922869 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.068654 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.323198 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.337356 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.401291 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.432157 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.530118 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.581877 4921 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.591733 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.592069 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.672837 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.687137 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.771854 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.898431 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 13:13:41 crc kubenswrapper[4921]: I0312 13:13:41.955781 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.091340 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.111791 4921 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.116382 4921 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.202325 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.294796 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.447170 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.469175 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.515345 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.708304 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.756171 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.802520 4921 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.809051 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.809029972 podStartE2EDuration="36.809029972s" podCreationTimestamp="2026-03-12 13:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
13:13:24.387561056 +0000 UTC m=+227.077633047" watchObservedRunningTime="2026-03-12 13:13:42.809029972 +0000 UTC m=+245.499101973" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.810037 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.810100 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.817937 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.841301 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.841275102 podStartE2EDuration="18.841275102s" podCreationTimestamp="2026-03-12 13:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:13:42.836786732 +0000 UTC m=+245.526858753" watchObservedRunningTime="2026-03-12 13:13:42.841275102 +0000 UTC m=+245.531347103" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.846242 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.880120 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.913585 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.914321 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 
13:13:42.922971 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 13:13:42 crc kubenswrapper[4921]: I0312 13:13:42.994031 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.209871 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.313303 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.382232 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.468958 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.479167 4921 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.542094 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.623596 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.640021 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.677319 4921 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.717787 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.738314 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.823821 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.841747 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.878591 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 13:13:43 crc kubenswrapper[4921]: I0312 13:13:43.914147 4921 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.053428 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.107083 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.118131 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.233665 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.290739 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.625372 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.646372 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.647438 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.710116 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.740611 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.880319 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 13:13:44 crc kubenswrapper[4921]: I0312 13:13:44.974550 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.009727 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.085136 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.093122 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.186088 4921 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.199991 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.435779 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.485695 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.511124 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.627547 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.701999 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.829359 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.838815 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.880529 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 13:13:45 crc kubenswrapper[4921]: I0312 13:13:45.919882 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 13:13:45 crc kubenswrapper[4921]: 
I0312 13:13:45.978385 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.038214 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.112608 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.167104 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.359331 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.370856 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.491301 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.504725 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.510291 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.511431 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.599239 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.608070 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.625773 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.730699 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.874540 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 13:13:46 crc kubenswrapper[4921]: I0312 13:13:46.961340 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.007202 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.047601 4921 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.047985 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548" gracePeriod=5 Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.067416 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 12 13:13:47 crc 
kubenswrapper[4921]: I0312 13:13:47.219555 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.250149 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.283536 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.302391 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.306448 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.328123 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.379659 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.430966 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.588931 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.617687 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.648898 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.679716 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.811151 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.877991 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 13:13:47 crc kubenswrapper[4921]: I0312 13:13:47.959657 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.018860 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.028812 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.055634 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.106593 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.119617 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.264491 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: 
I0312 13:13:48.333057 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.533746 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.583434 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.591968 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.632629 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.849501 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 13:13:48 crc kubenswrapper[4921]: I0312 13:13:48.909125 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.040388 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.113889 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.151920 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.175984 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 
13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.232927 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.251135 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.383897 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.483614 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.484870 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.572884 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.581851 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.670279 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.757739 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.875283 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.910319 4921 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 13:13:49 crc kubenswrapper[4921]: I0312 13:13:49.990240 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 13:13:50 crc kubenswrapper[4921]: I0312 13:13:50.165181 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 13:13:50 crc kubenswrapper[4921]: I0312 13:13:50.416860 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 13:13:50 crc kubenswrapper[4921]: I0312 13:13:50.839156 4921 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.117185 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.556212 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.598049 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.625426 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.669601 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.702261 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.718909 4921 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.770191 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.787342 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.844882 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.915629 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 12 13:13:51 crc kubenswrapper[4921]: I0312 13:13:51.984313 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.121484 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.163867 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.589942 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.608981 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.624084 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.652970 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.653089 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.709440 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803593 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803655 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803690 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803726 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803794 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803865 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803889 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.803974 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.804117 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.804248 4921 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.804269 4921 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.804286 4921 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.804305 4921 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.814489 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.864085 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.873771 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.874185 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.874278 4921 scope.go:117] "RemoveContainer" containerID="1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.873899 4921 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548" exitCode=137 Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.893533 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.900587 4921 scope.go:117] "RemoveContainer" containerID="1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548" Mar 12 13:13:52 crc kubenswrapper[4921]: E0312 13:13:52.901032 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548\": container with ID starting with 1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548 not found: ID does not exist" containerID="1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548" Mar 12 13:13:52 crc 
kubenswrapper[4921]: I0312 13:13:52.901082 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548"} err="failed to get container status \"1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548\": rpc error: code = NotFound desc = could not find container \"1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548\": container with ID starting with 1503c995a91d9c1f12166bd1235aa10c9cd354ed397648bceb771dd63797d548 not found: ID does not exist" Mar 12 13:13:52 crc kubenswrapper[4921]: I0312 13:13:52.904997 4921 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:13:53 crc kubenswrapper[4921]: I0312 13:13:53.229922 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 13:13:53 crc kubenswrapper[4921]: I0312 13:13:53.298505 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 13:13:53 crc kubenswrapper[4921]: I0312 13:13:53.998287 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 12 13:13:53 crc kubenswrapper[4921]: I0312 13:13:53.999116 4921 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 12 13:13:54 crc kubenswrapper[4921]: I0312 13:13:54.013996 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 13:13:54 crc kubenswrapper[4921]: I0312 13:13:54.014044 4921 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" 
mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="86179fa5-22c3-482c-b8bf-1796cb46f023" Mar 12 13:13:54 crc kubenswrapper[4921]: I0312 13:13:54.021334 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 12 13:13:54 crc kubenswrapper[4921]: I0312 13:13:54.021375 4921 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="86179fa5-22c3-482c-b8bf-1796cb46f023" Mar 12 13:13:54 crc kubenswrapper[4921]: I0312 13:13:54.115143 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 13:13:54 crc kubenswrapper[4921]: I0312 13:13:54.553234 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 13:13:56 crc kubenswrapper[4921]: I0312 13:13:56.324105 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:13:56 crc kubenswrapper[4921]: I0312 13:13:56.324202 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.172223 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555354-kkl75"] Mar 12 13:14:00 crc kubenswrapper[4921]: E0312 13:14:00.173572 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.173665 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 13:14:00 crc kubenswrapper[4921]: E0312 13:14:00.173749 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" containerName="installer" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.173837 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" containerName="installer" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.174030 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="14fec143-fc2c-4cef-93c5-0bcc947068a3" containerName="installer" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.174118 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.174600 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.176661 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.176895 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.177114 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.177909 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555354-kkl75"] Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.311703 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dggz9\" (UniqueName: \"kubernetes.io/projected/d5704616-685f-49f3-9dd7-bb080b87cf29-kube-api-access-dggz9\") pod \"auto-csr-approver-29555354-kkl75\" (UID: \"d5704616-685f-49f3-9dd7-bb080b87cf29\") " pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.413713 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dggz9\" (UniqueName: \"kubernetes.io/projected/d5704616-685f-49f3-9dd7-bb080b87cf29-kube-api-access-dggz9\") pod \"auto-csr-approver-29555354-kkl75\" (UID: \"d5704616-685f-49f3-9dd7-bb080b87cf29\") " pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.436788 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dggz9\" (UniqueName: \"kubernetes.io/projected/d5704616-685f-49f3-9dd7-bb080b87cf29-kube-api-access-dggz9\") pod \"auto-csr-approver-29555354-kkl75\" (UID: \"d5704616-685f-49f3-9dd7-bb080b87cf29\") " 
pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.491596 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.890509 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555354-kkl75"] Mar 12 13:14:00 crc kubenswrapper[4921]: W0312 13:14:00.902981 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5704616_685f_49f3_9dd7_bb080b87cf29.slice/crio-bbc03aaf28ba00fa6203d6ce72d0e7ad568a7c43f1f3d2a98474ab63d7f7a66c WatchSource:0}: Error finding container bbc03aaf28ba00fa6203d6ce72d0e7ad568a7c43f1f3d2a98474ab63d7f7a66c: Status 404 returned error can't find the container with id bbc03aaf28ba00fa6203d6ce72d0e7ad568a7c43f1f3d2a98474ab63d7f7a66c Mar 12 13:14:00 crc kubenswrapper[4921]: I0312 13:14:00.930367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555354-kkl75" event={"ID":"d5704616-685f-49f3-9dd7-bb080b87cf29","Type":"ContainerStarted","Data":"bbc03aaf28ba00fa6203d6ce72d0e7ad568a7c43f1f3d2a98474ab63d7f7a66c"} Mar 12 13:14:02 crc kubenswrapper[4921]: I0312 13:14:02.943090 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5704616-685f-49f3-9dd7-bb080b87cf29" containerID="192152a3bc8743f7d3d4259bd68947af5ad7ef207a58fbb29a437f3f703149eb" exitCode=0 Mar 12 13:14:02 crc kubenswrapper[4921]: I0312 13:14:02.943132 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555354-kkl75" event={"ID":"d5704616-685f-49f3-9dd7-bb080b87cf29","Type":"ContainerDied","Data":"192152a3bc8743f7d3d4259bd68947af5ad7ef207a58fbb29a437f3f703149eb"} Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.230704 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.366893 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dggz9\" (UniqueName: \"kubernetes.io/projected/d5704616-685f-49f3-9dd7-bb080b87cf29-kube-api-access-dggz9\") pod \"d5704616-685f-49f3-9dd7-bb080b87cf29\" (UID: \"d5704616-685f-49f3-9dd7-bb080b87cf29\") " Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.378929 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5704616-685f-49f3-9dd7-bb080b87cf29-kube-api-access-dggz9" (OuterVolumeSpecName: "kube-api-access-dggz9") pod "d5704616-685f-49f3-9dd7-bb080b87cf29" (UID: "d5704616-685f-49f3-9dd7-bb080b87cf29"). InnerVolumeSpecName "kube-api-access-dggz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.468349 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dggz9\" (UniqueName: \"kubernetes.io/projected/d5704616-685f-49f3-9dd7-bb080b87cf29-kube-api-access-dggz9\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.956072 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555354-kkl75" event={"ID":"d5704616-685f-49f3-9dd7-bb080b87cf29","Type":"ContainerDied","Data":"bbc03aaf28ba00fa6203d6ce72d0e7ad568a7c43f1f3d2a98474ab63d7f7a66c"} Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.956414 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbc03aaf28ba00fa6203d6ce72d0e7ad568a7c43f1f3d2a98474ab63d7f7a66c" Mar 12 13:14:04 crc kubenswrapper[4921]: I0312 13:14:04.956160 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555354-kkl75" Mar 12 13:14:22 crc kubenswrapper[4921]: I0312 13:14:22.763206 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c4c9867ff-skt86"] Mar 12 13:14:22 crc kubenswrapper[4921]: I0312 13:14:22.763991 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" podUID="82232748-66cb-4f41-857a-b6bbd9e03cf4" containerName="controller-manager" containerID="cri-o://292f63dbe64ce24c16a4731b5e3ca97db3bd234d46c97826674dcc990a3ec97f" gracePeriod=30 Mar 12 13:14:22 crc kubenswrapper[4921]: I0312 13:14:22.768469 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z"] Mar 12 13:14:22 crc kubenswrapper[4921]: I0312 13:14:22.768700 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" podUID="1fa61e61-04e1-4a45-966b-a347b6491128" containerName="route-controller-manager" containerID="cri-o://c7842a09fa638ae438b87666d44051084a29b051aef3571e9a197f16eee8a5d7" gracePeriod=30 Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.055980 4921 generic.go:334] "Generic (PLEG): container finished" podID="82232748-66cb-4f41-857a-b6bbd9e03cf4" containerID="292f63dbe64ce24c16a4731b5e3ca97db3bd234d46c97826674dcc990a3ec97f" exitCode=0 Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.056038 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" event={"ID":"82232748-66cb-4f41-857a-b6bbd9e03cf4","Type":"ContainerDied","Data":"292f63dbe64ce24c16a4731b5e3ca97db3bd234d46c97826674dcc990a3ec97f"} Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.057141 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="1fa61e61-04e1-4a45-966b-a347b6491128" containerID="c7842a09fa638ae438b87666d44051084a29b051aef3571e9a197f16eee8a5d7" exitCode=0 Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.057159 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" event={"ID":"1fa61e61-04e1-4a45-966b-a347b6491128","Type":"ContainerDied","Data":"c7842a09fa638ae438b87666d44051084a29b051aef3571e9a197f16eee8a5d7"} Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.192670 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.197066 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319612 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-proxy-ca-bundles\") pod \"82232748-66cb-4f41-857a-b6bbd9e03cf4\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319680 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksh5m\" (UniqueName: \"kubernetes.io/projected/1fa61e61-04e1-4a45-966b-a347b6491128-kube-api-access-ksh5m\") pod \"1fa61e61-04e1-4a45-966b-a347b6491128\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319712 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-config\") pod \"82232748-66cb-4f41-857a-b6bbd9e03cf4\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " Mar 12 
13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319752 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-config\") pod \"1fa61e61-04e1-4a45-966b-a347b6491128\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319775 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-client-ca\") pod \"1fa61e61-04e1-4a45-966b-a347b6491128\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319797 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-client-ca\") pod \"82232748-66cb-4f41-857a-b6bbd9e03cf4\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319840 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82232748-66cb-4f41-857a-b6bbd9e03cf4-serving-cert\") pod \"82232748-66cb-4f41-857a-b6bbd9e03cf4\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319878 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa61e61-04e1-4a45-966b-a347b6491128-serving-cert\") pod \"1fa61e61-04e1-4a45-966b-a347b6491128\" (UID: \"1fa61e61-04e1-4a45-966b-a347b6491128\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.319909 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9dmj\" (UniqueName: \"kubernetes.io/projected/82232748-66cb-4f41-857a-b6bbd9e03cf4-kube-api-access-q9dmj\") pod 
\"82232748-66cb-4f41-857a-b6bbd9e03cf4\" (UID: \"82232748-66cb-4f41-857a-b6bbd9e03cf4\") " Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.320278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "82232748-66cb-4f41-857a-b6bbd9e03cf4" (UID: "82232748-66cb-4f41-857a-b6bbd9e03cf4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.320368 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-config" (OuterVolumeSpecName: "config") pod "82232748-66cb-4f41-857a-b6bbd9e03cf4" (UID: "82232748-66cb-4f41-857a-b6bbd9e03cf4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.320422 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-client-ca" (OuterVolumeSpecName: "client-ca") pod "82232748-66cb-4f41-857a-b6bbd9e03cf4" (UID: "82232748-66cb-4f41-857a-b6bbd9e03cf4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.320565 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-client-ca" (OuterVolumeSpecName: "client-ca") pod "1fa61e61-04e1-4a45-966b-a347b6491128" (UID: "1fa61e61-04e1-4a45-966b-a347b6491128"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.321089 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-config" (OuterVolumeSpecName: "config") pod "1fa61e61-04e1-4a45-966b-a347b6491128" (UID: "1fa61e61-04e1-4a45-966b-a347b6491128"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.324844 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fa61e61-04e1-4a45-966b-a347b6491128-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1fa61e61-04e1-4a45-966b-a347b6491128" (UID: "1fa61e61-04e1-4a45-966b-a347b6491128"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.325281 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82232748-66cb-4f41-857a-b6bbd9e03cf4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "82232748-66cb-4f41-857a-b6bbd9e03cf4" (UID: "82232748-66cb-4f41-857a-b6bbd9e03cf4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.326035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82232748-66cb-4f41-857a-b6bbd9e03cf4-kube-api-access-q9dmj" (OuterVolumeSpecName: "kube-api-access-q9dmj") pod "82232748-66cb-4f41-857a-b6bbd9e03cf4" (UID: "82232748-66cb-4f41-857a-b6bbd9e03cf4"). InnerVolumeSpecName "kube-api-access-q9dmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.325619 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa61e61-04e1-4a45-966b-a347b6491128-kube-api-access-ksh5m" (OuterVolumeSpecName: "kube-api-access-ksh5m") pod "1fa61e61-04e1-4a45-966b-a347b6491128" (UID: "1fa61e61-04e1-4a45-966b-a347b6491128"). InnerVolumeSpecName "kube-api-access-ksh5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421341 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421395 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421413 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82232748-66cb-4f41-857a-b6bbd9e03cf4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421431 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa61e61-04e1-4a45-966b-a347b6491128-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421452 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9dmj\" (UniqueName: \"kubernetes.io/projected/82232748-66cb-4f41-857a-b6bbd9e03cf4-kube-api-access-q9dmj\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421470 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421487 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksh5m\" (UniqueName: \"kubernetes.io/projected/1fa61e61-04e1-4a45-966b-a347b6491128-kube-api-access-ksh5m\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421504 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82232748-66cb-4f41-857a-b6bbd9e03cf4-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:23 crc kubenswrapper[4921]: I0312 13:14:23.421519 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa61e61-04e1-4a45-966b-a347b6491128-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.065174 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" event={"ID":"82232748-66cb-4f41-857a-b6bbd9e03cf4","Type":"ContainerDied","Data":"e2d6164e1a60abec9ea08cb5fea1bdd999fc0505ad904523c1ee150ff65bd3bc"} Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.065201 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5c4c9867ff-skt86" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.065646 4921 scope.go:117] "RemoveContainer" containerID="292f63dbe64ce24c16a4731b5e3ca97db3bd234d46c97826674dcc990a3ec97f" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.066945 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" event={"ID":"1fa61e61-04e1-4a45-966b-a347b6491128","Type":"ContainerDied","Data":"097b011c0795a0f1a2341d4a0805d3d81de5be38f8b966839b68ef323e9e5c13"} Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.067056 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.091700 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.092194 4921 scope.go:117] "RemoveContainer" containerID="c7842a09fa638ae438b87666d44051084a29b051aef3571e9a197f16eee8a5d7" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.097126 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6fdd7bd669-tm58z"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.116312 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5c4c9867ff-skt86"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.126695 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5c4c9867ff-skt86"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.850893 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj"] Mar 12 13:14:24 crc kubenswrapper[4921]: E0312 13:14:24.851110 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82232748-66cb-4f41-857a-b6bbd9e03cf4" containerName="controller-manager" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851122 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="82232748-66cb-4f41-857a-b6bbd9e03cf4" containerName="controller-manager" Mar 12 13:14:24 crc kubenswrapper[4921]: E0312 13:14:24.851137 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa61e61-04e1-4a45-966b-a347b6491128" containerName="route-controller-manager" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851145 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa61e61-04e1-4a45-966b-a347b6491128" containerName="route-controller-manager" Mar 12 13:14:24 crc kubenswrapper[4921]: E0312 13:14:24.851154 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5704616-685f-49f3-9dd7-bb080b87cf29" containerName="oc" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851160 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5704616-685f-49f3-9dd7-bb080b87cf29" containerName="oc" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851244 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="82232748-66cb-4f41-857a-b6bbd9e03cf4" containerName="controller-manager" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851254 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5704616-685f-49f3-9dd7-bb080b87cf29" containerName="oc" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851266 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa61e61-04e1-4a45-966b-a347b6491128" containerName="route-controller-manager" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.851599 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.865507 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.867016 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.868766 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.868900 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.869261 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.869626 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.869791 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.869961 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.873070 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.873456 4921 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.873935 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.874204 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.874375 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.874429 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.874766 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.878442 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj"] Mar 12 13:14:24 crc kubenswrapper[4921]: I0312 13:14:24.882553 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.043917 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22edfb2d-5b00-4737-ba50-dee4d973b394-serving-cert\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.044889 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-h8t74\" (UniqueName: \"kubernetes.io/projected/22edfb2d-5b00-4737-ba50-dee4d973b394-kube-api-access-h8t74\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.044954 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-client-ca\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.045111 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-config\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.045194 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33092d88-16e4-4910-8da5-602c25961cb1-serving-cert\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.045443 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-client-ca\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: 
\"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.045556 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-proxy-ca-bundles\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.045625 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-config\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.045656 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ckl\" (UniqueName: \"kubernetes.io/projected/33092d88-16e4-4910-8da5-602c25961cb1-kube-api-access-b6ckl\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.146751 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8t74\" (UniqueName: \"kubernetes.io/projected/22edfb2d-5b00-4737-ba50-dee4d973b394-kube-api-access-h8t74\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.146878 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-client-ca\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.146986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-config\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.147038 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33092d88-16e4-4910-8da5-602c25961cb1-serving-cert\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.147088 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-client-ca\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.147158 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-proxy-ca-bundles\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " 
pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.147226 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-config\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.147314 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ckl\" (UniqueName: \"kubernetes.io/projected/33092d88-16e4-4910-8da5-602c25961cb1-kube-api-access-b6ckl\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.147441 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22edfb2d-5b00-4737-ba50-dee4d973b394-serving-cert\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.149444 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-config\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.149980 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-client-ca\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.150024 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-client-ca\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.150251 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-config\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.151388 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-proxy-ca-bundles\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.159166 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22edfb2d-5b00-4737-ba50-dee4d973b394-serving-cert\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.164512 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33092d88-16e4-4910-8da5-602c25961cb1-serving-cert\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.169661 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8t74\" (UniqueName: \"kubernetes.io/projected/22edfb2d-5b00-4737-ba50-dee4d973b394-kube-api-access-h8t74\") pod \"controller-manager-7ff5bf444c-xlkpj\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.171889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6ckl\" (UniqueName: \"kubernetes.io/projected/33092d88-16e4-4910-8da5-602c25961cb1-kube-api-access-b6ckl\") pod \"route-controller-manager-86c679cff5-mrgdc\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.172940 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.196064 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.409879 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj"] Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.460162 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc"] Mar 12 13:14:25 crc kubenswrapper[4921]: W0312 13:14:25.463120 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33092d88_16e4_4910_8da5_602c25961cb1.slice/crio-617f904436af4a59cabee6095d3fedf9cc233e00e0e593009e36cffeadfadeae WatchSource:0}: Error finding container 617f904436af4a59cabee6095d3fedf9cc233e00e0e593009e36cffeadfadeae: Status 404 returned error can't find the container with id 617f904436af4a59cabee6095d3fedf9cc233e00e0e593009e36cffeadfadeae Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.990635 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa61e61-04e1-4a45-966b-a347b6491128" path="/var/lib/kubelet/pods/1fa61e61-04e1-4a45-966b-a347b6491128/volumes" Mar 12 13:14:25 crc kubenswrapper[4921]: I0312 13:14:25.992116 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82232748-66cb-4f41-857a-b6bbd9e03cf4" path="/var/lib/kubelet/pods/82232748-66cb-4f41-857a-b6bbd9e03cf4/volumes" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.085658 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" event={"ID":"22edfb2d-5b00-4737-ba50-dee4d973b394","Type":"ContainerStarted","Data":"df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7"} Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.085713 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" event={"ID":"22edfb2d-5b00-4737-ba50-dee4d973b394","Type":"ContainerStarted","Data":"169e7c72a57e4240e8123c3b7562a679b4703771fa193426720618d49b0d5b0d"} Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.085946 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.089874 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" event={"ID":"33092d88-16e4-4910-8da5-602c25961cb1","Type":"ContainerStarted","Data":"830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05"} Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.089938 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" event={"ID":"33092d88-16e4-4910-8da5-602c25961cb1","Type":"ContainerStarted","Data":"617f904436af4a59cabee6095d3fedf9cc233e00e0e593009e36cffeadfadeae"} Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.090066 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.092525 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.095114 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.109313 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" 
podStartSLOduration=3.109286152 podStartE2EDuration="3.109286152s" podCreationTimestamp="2026-03-12 13:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:14:26.103441358 +0000 UTC m=+288.793513329" watchObservedRunningTime="2026-03-12 13:14:26.109286152 +0000 UTC m=+288.799358133" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.122468 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" podStartSLOduration=3.122446626 podStartE2EDuration="3.122446626s" podCreationTimestamp="2026-03-12 13:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:14:26.121651161 +0000 UTC m=+288.811723132" watchObservedRunningTime="2026-03-12 13:14:26.122446626 +0000 UTC m=+288.812518597" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.323943 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.324002 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.324058 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 
13:14:26.324589 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:14:26 crc kubenswrapper[4921]: I0312 13:14:26.324636 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc" gracePeriod=600 Mar 12 13:14:27 crc kubenswrapper[4921]: I0312 13:14:27.099128 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc" exitCode=0 Mar 12 13:14:27 crc kubenswrapper[4921]: I0312 13:14:27.099257 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc"} Mar 12 13:14:27 crc kubenswrapper[4921]: I0312 13:14:27.099694 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"10c61861e52d240193680813d2394b39e92b34ce948352b7c71e1120e87603ad"} Mar 12 13:14:50 crc kubenswrapper[4921]: I0312 13:14:50.936943 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gg92s"] Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.148526 4921 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6"] Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.152362 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.155336 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.155446 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.164589 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6"] Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.224994 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dzpv\" (UniqueName: \"kubernetes.io/projected/0ca8df82-33c8-43ef-8e87-9df25af27923-kube-api-access-4dzpv\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.225058 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ca8df82-33c8-43ef-8e87-9df25af27923-config-volume\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.225151 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0ca8df82-33c8-43ef-8e87-9df25af27923-secret-volume\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.326499 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ca8df82-33c8-43ef-8e87-9df25af27923-config-volume\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.326579 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ca8df82-33c8-43ef-8e87-9df25af27923-secret-volume\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.326687 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dzpv\" (UniqueName: \"kubernetes.io/projected/0ca8df82-33c8-43ef-8e87-9df25af27923-kube-api-access-4dzpv\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.328424 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ca8df82-33c8-43ef-8e87-9df25af27923-config-volume\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.337754 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ca8df82-33c8-43ef-8e87-9df25af27923-secret-volume\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.359090 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dzpv\" (UniqueName: \"kubernetes.io/projected/0ca8df82-33c8-43ef-8e87-9df25af27923-kube-api-access-4dzpv\") pod \"collect-profiles-29555355-8psk6\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.478945 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:00 crc kubenswrapper[4921]: I0312 13:15:00.963631 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6"] Mar 12 13:15:01 crc kubenswrapper[4921]: I0312 13:15:01.323945 4921 generic.go:334] "Generic (PLEG): container finished" podID="0ca8df82-33c8-43ef-8e87-9df25af27923" containerID="642a58f7ddbf06f95ca332f4a68933c68769ea95f0d709b937fdfc24450ad2d5" exitCode=0 Mar 12 13:15:01 crc kubenswrapper[4921]: I0312 13:15:01.324044 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" event={"ID":"0ca8df82-33c8-43ef-8e87-9df25af27923","Type":"ContainerDied","Data":"642a58f7ddbf06f95ca332f4a68933c68769ea95f0d709b937fdfc24450ad2d5"} Mar 12 13:15:01 crc kubenswrapper[4921]: I0312 13:15:01.324360 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" 
event={"ID":"0ca8df82-33c8-43ef-8e87-9df25af27923","Type":"ContainerStarted","Data":"ee7d4156bdc6a0d1bb5f758b50d1919b98c0cf035c79dc0f57ff1879b4cca425"} Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.697254 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.808663 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj"] Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.808964 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" podUID="22edfb2d-5b00-4737-ba50-dee4d973b394" containerName="controller-manager" containerID="cri-o://df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7" gracePeriod=30 Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.858463 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ca8df82-33c8-43ef-8e87-9df25af27923-config-volume\") pod \"0ca8df82-33c8-43ef-8e87-9df25af27923\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.858734 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dzpv\" (UniqueName: \"kubernetes.io/projected/0ca8df82-33c8-43ef-8e87-9df25af27923-kube-api-access-4dzpv\") pod \"0ca8df82-33c8-43ef-8e87-9df25af27923\" (UID: \"0ca8df82-33c8-43ef-8e87-9df25af27923\") " Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.858782 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ca8df82-33c8-43ef-8e87-9df25af27923-secret-volume\") pod \"0ca8df82-33c8-43ef-8e87-9df25af27923\" (UID: 
\"0ca8df82-33c8-43ef-8e87-9df25af27923\") " Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.859275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ca8df82-33c8-43ef-8e87-9df25af27923-config-volume" (OuterVolumeSpecName: "config-volume") pod "0ca8df82-33c8-43ef-8e87-9df25af27923" (UID: "0ca8df82-33c8-43ef-8e87-9df25af27923"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.864666 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca8df82-33c8-43ef-8e87-9df25af27923-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0ca8df82-33c8-43ef-8e87-9df25af27923" (UID: "0ca8df82-33c8-43ef-8e87-9df25af27923"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.865336 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca8df82-33c8-43ef-8e87-9df25af27923-kube-api-access-4dzpv" (OuterVolumeSpecName: "kube-api-access-4dzpv") pod "0ca8df82-33c8-43ef-8e87-9df25af27923" (UID: "0ca8df82-33c8-43ef-8e87-9df25af27923"). InnerVolumeSpecName "kube-api-access-4dzpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.959724 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ca8df82-33c8-43ef-8e87-9df25af27923-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.959760 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dzpv\" (UniqueName: \"kubernetes.io/projected/0ca8df82-33c8-43ef-8e87-9df25af27923-kube-api-access-4dzpv\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:02 crc kubenswrapper[4921]: I0312 13:15:02.959772 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0ca8df82-33c8-43ef-8e87-9df25af27923-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.209170 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.335751 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" event={"ID":"0ca8df82-33c8-43ef-8e87-9df25af27923","Type":"ContainerDied","Data":"ee7d4156bdc6a0d1bb5f758b50d1919b98c0cf035c79dc0f57ff1879b4cca425"} Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.335787 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee7d4156bdc6a0d1bb5f758b50d1919b98c0cf035c79dc0f57ff1879b4cca425" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.335834 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.337409 4921 generic.go:334] "Generic (PLEG): container finished" podID="22edfb2d-5b00-4737-ba50-dee4d973b394" containerID="df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7" exitCode=0 Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.337462 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" event={"ID":"22edfb2d-5b00-4737-ba50-dee4d973b394","Type":"ContainerDied","Data":"df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7"} Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.337496 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" event={"ID":"22edfb2d-5b00-4737-ba50-dee4d973b394","Type":"ContainerDied","Data":"169e7c72a57e4240e8123c3b7562a679b4703771fa193426720618d49b0d5b0d"} Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.337516 4921 scope.go:117] "RemoveContainer" containerID="df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.337584 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.360220 4921 scope.go:117] "RemoveContainer" containerID="df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7" Mar 12 13:15:03 crc kubenswrapper[4921]: E0312 13:15:03.361084 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7\": container with ID starting with df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7 not found: ID does not exist" containerID="df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.361144 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7"} err="failed to get container status \"df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7\": rpc error: code = NotFound desc = could not find container \"df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7\": container with ID starting with df5a0798533b9ca6016608e7dc2cda6faac99a4a065433e4b112614e12a2b6e7 not found: ID does not exist" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.365139 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8t74\" (UniqueName: \"kubernetes.io/projected/22edfb2d-5b00-4737-ba50-dee4d973b394-kube-api-access-h8t74\") pod \"22edfb2d-5b00-4737-ba50-dee4d973b394\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.365249 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-client-ca\") pod \"22edfb2d-5b00-4737-ba50-dee4d973b394\" 
(UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.365290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-proxy-ca-bundles\") pod \"22edfb2d-5b00-4737-ba50-dee4d973b394\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.365321 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22edfb2d-5b00-4737-ba50-dee4d973b394-serving-cert\") pod \"22edfb2d-5b00-4737-ba50-dee4d973b394\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.365375 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-config\") pod \"22edfb2d-5b00-4737-ba50-dee4d973b394\" (UID: \"22edfb2d-5b00-4737-ba50-dee4d973b394\") " Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.366251 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-client-ca" (OuterVolumeSpecName: "client-ca") pod "22edfb2d-5b00-4737-ba50-dee4d973b394" (UID: "22edfb2d-5b00-4737-ba50-dee4d973b394"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.366862 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-config" (OuterVolumeSpecName: "config") pod "22edfb2d-5b00-4737-ba50-dee4d973b394" (UID: "22edfb2d-5b00-4737-ba50-dee4d973b394"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.367014 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "22edfb2d-5b00-4737-ba50-dee4d973b394" (UID: "22edfb2d-5b00-4737-ba50-dee4d973b394"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.368766 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22edfb2d-5b00-4737-ba50-dee4d973b394-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "22edfb2d-5b00-4737-ba50-dee4d973b394" (UID: "22edfb2d-5b00-4737-ba50-dee4d973b394"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.369471 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22edfb2d-5b00-4737-ba50-dee4d973b394-kube-api-access-h8t74" (OuterVolumeSpecName: "kube-api-access-h8t74") pod "22edfb2d-5b00-4737-ba50-dee4d973b394" (UID: "22edfb2d-5b00-4737-ba50-dee4d973b394"). InnerVolumeSpecName "kube-api-access-h8t74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.467208 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8t74\" (UniqueName: \"kubernetes.io/projected/22edfb2d-5b00-4737-ba50-dee4d973b394-kube-api-access-h8t74\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.467287 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.467334 4921 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.467359 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22edfb2d-5b00-4737-ba50-dee4d973b394-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.467386 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22edfb2d-5b00-4737-ba50-dee4d973b394-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.667847 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj"] Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.670717 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7ff5bf444c-xlkpj"] Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.882864 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64dbcf6866-2pzzr"] Mar 12 13:15:03 crc kubenswrapper[4921]: E0312 
13:15:03.883183 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca8df82-33c8-43ef-8e87-9df25af27923" containerName="collect-profiles" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.883197 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca8df82-33c8-43ef-8e87-9df25af27923" containerName="collect-profiles" Mar 12 13:15:03 crc kubenswrapper[4921]: E0312 13:15:03.883218 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22edfb2d-5b00-4737-ba50-dee4d973b394" containerName="controller-manager" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.883224 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="22edfb2d-5b00-4737-ba50-dee4d973b394" containerName="controller-manager" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.883360 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="22edfb2d-5b00-4737-ba50-dee4d973b394" containerName="controller-manager" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.883371 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca8df82-33c8-43ef-8e87-9df25af27923" containerName="collect-profiles" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.883828 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.885559 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.885638 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.886211 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.888569 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.889788 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.890256 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcf6866-2pzzr"] Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.890368 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.892051 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.972228 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-serving-cert\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " 
pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.972292 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9d8t\" (UniqueName: \"kubernetes.io/projected/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-kube-api-access-b9d8t\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.972330 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-config\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.972369 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-proxy-ca-bundles\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.972442 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-client-ca\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:03 crc kubenswrapper[4921]: I0312 13:15:03.999060 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="22edfb2d-5b00-4737-ba50-dee4d973b394" path="/var/lib/kubelet/pods/22edfb2d-5b00-4737-ba50-dee4d973b394/volumes" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.073468 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-serving-cert\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.073542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9d8t\" (UniqueName: \"kubernetes.io/projected/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-kube-api-access-b9d8t\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.073598 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-config\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.073640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-proxy-ca-bundles\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.073752 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-client-ca\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.078388 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-config\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.078624 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-serving-cert\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.079709 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-proxy-ca-bundles\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.080646 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-client-ca\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.090416 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-b9d8t\" (UniqueName: \"kubernetes.io/projected/c8d9afc5-dc56-403f-84b8-c2269b6c83b9-kube-api-access-b9d8t\") pod \"controller-manager-64dbcf6866-2pzzr\" (UID: \"c8d9afc5-dc56-403f-84b8-c2269b6c83b9\") " pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.202930 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:04 crc kubenswrapper[4921]: I0312 13:15:04.689497 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcf6866-2pzzr"] Mar 12 13:15:05 crc kubenswrapper[4921]: I0312 13:15:05.354331 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" event={"ID":"c8d9afc5-dc56-403f-84b8-c2269b6c83b9","Type":"ContainerStarted","Data":"026eeac87005c0d6714940108205f579445d94c859dfecfbfdf961310ff7583b"} Mar 12 13:15:05 crc kubenswrapper[4921]: I0312 13:15:05.354663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" event={"ID":"c8d9afc5-dc56-403f-84b8-c2269b6c83b9","Type":"ContainerStarted","Data":"e82a4dad3de63349f3cfef1fbe858926d5c3405936f4cf85e60e53716f465925"} Mar 12 13:15:05 crc kubenswrapper[4921]: I0312 13:15:05.354806 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:05 crc kubenswrapper[4921]: I0312 13:15:05.361404 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" Mar 12 13:15:05 crc kubenswrapper[4921]: I0312 13:15:05.373805 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64dbcf6866-2pzzr" 
podStartSLOduration=3.373782033 podStartE2EDuration="3.373782033s" podCreationTimestamp="2026-03-12 13:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:15:05.368995641 +0000 UTC m=+328.059067642" watchObservedRunningTime="2026-03-12 13:15:05.373782033 +0000 UTC m=+328.063854014" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.740466 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5nktn"] Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.742568 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.756677 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5nktn"] Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880507 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phrv5\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-kube-api-access-phrv5\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880558 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-trusted-ca\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880585 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880636 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-bound-sa-token\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880677 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-registry-certificates\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") 
" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.880705 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-registry-tls\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.907375 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982071 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phrv5\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-kube-api-access-phrv5\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-trusted-ca\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982186 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982239 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-bound-sa-token\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982266 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-registry-certificates\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.982288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-registry-tls\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.983087 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.984005 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-trusted-ca\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.985346 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-registry-certificates\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.993083 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:11 crc kubenswrapper[4921]: I0312 13:15:11.993201 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-registry-tls\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:12 crc 
kubenswrapper[4921]: I0312 13:15:12.004610 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-bound-sa-token\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:12 crc kubenswrapper[4921]: I0312 13:15:12.007296 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phrv5\" (UniqueName: \"kubernetes.io/projected/a55a7eb8-f22e-48f7-a6aa-775b15fe71a7-kube-api-access-phrv5\") pod \"image-registry-66df7c8f76-5nktn\" (UID: \"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:12 crc kubenswrapper[4921]: I0312 13:15:12.073552 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:12 crc kubenswrapper[4921]: I0312 13:15:12.611748 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5nktn"] Mar 12 13:15:13 crc kubenswrapper[4921]: I0312 13:15:13.411518 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" event={"ID":"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7","Type":"ContainerStarted","Data":"fd163e73f3c1f88d4c71d087ac66bc39035d4a2537baa40c6c45291953e411bb"} Mar 12 13:15:13 crc kubenswrapper[4921]: I0312 13:15:13.411900 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" event={"ID":"a55a7eb8-f22e-48f7-a6aa-775b15fe71a7","Type":"ContainerStarted","Data":"399acbe824ea33329ea8f7022d77ef6113b8013bbd328ff168e707ea2334a5e6"} Mar 12 13:15:13 crc kubenswrapper[4921]: I0312 13:15:13.411926 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:13 crc kubenswrapper[4921]: I0312 13:15:13.440931 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" podStartSLOduration=2.440908515 podStartE2EDuration="2.440908515s" podCreationTimestamp="2026-03-12 13:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:15:13.437248558 +0000 UTC m=+336.127320569" watchObservedRunningTime="2026-03-12 13:15:13.440908515 +0000 UTC m=+336.130980526" Mar 12 13:15:15 crc kubenswrapper[4921]: I0312 13:15:15.969601 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" podUID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" containerName="oauth-openshift" containerID="cri-o://a911277ac8f2809155389ac0eafd14fe03e913d28177a70db104bfab58669e17" gracePeriod=15 Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.432928 4921 generic.go:334] "Generic (PLEG): container finished" podID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" containerID="a911277ac8f2809155389ac0eafd14fe03e913d28177a70db104bfab58669e17" exitCode=0 Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.432987 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" event={"ID":"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce","Type":"ContainerDied","Data":"a911277ac8f2809155389ac0eafd14fe03e913d28177a70db104bfab58669e17"} Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.523056 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.574342 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-bfb76855c-tdq4k"] Mar 12 13:15:16 crc kubenswrapper[4921]: E0312 13:15:16.574636 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" containerName="oauth-openshift" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.574660 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" containerName="oauth-openshift" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.574940 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" containerName="oauth-openshift" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.575789 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.593414 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bfb76855c-tdq4k"] Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-session\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651690 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-serving-cert\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: 
\"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651717 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-policies\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651746 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-ocp-branding-template\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651762 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-router-certs\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651778 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-cliconfig\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651795 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6kqr\" (UniqueName: \"kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651816 
4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-login\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651877 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-dir\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651895 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-error\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651932 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-trusted-ca-bundle\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651949 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-idp-0-file-data\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651967 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-service-ca\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.651993 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-provider-selection\") pod \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\" (UID: \"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce\") " Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.653359 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.653956 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.654173 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.654228 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.654499 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.658285 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.658681 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.660059 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr" (OuterVolumeSpecName: "kube-api-access-n6kqr") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "kube-api-access-n6kqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.664975 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.666214 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.667354 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.667576 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.668665 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.668798 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" (UID: "88e0b0eb-d051-410d-b2e8-c80e9fe3fdce"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.753753 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.753853 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-error\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.753907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-session\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.753946 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754014 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-router-certs\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-service-ca\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754071 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca82da46-ac84-4b79-b5f4-2132a87be65b-audit-dir\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754098 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754122 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754166 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754192 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-login\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754242 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-audit-policies\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754270 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blczs\" (UniqueName: \"kubernetes.io/projected/ca82da46-ac84-4b79-b5f4-2132a87be65b-kube-api-access-blczs\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: 
\"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754464 4921 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754514 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754540 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754561 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754585 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754607 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754627 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754650 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754669 4921 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754692 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754711 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754730 4921 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754749 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6kqr\" (UniqueName: \"kubernetes.io/projected/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-kube-api-access-n6kqr\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.754770 4921 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.855996 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856109 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856168 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-error\") pod 
\"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856223 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-session\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-router-certs\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856337 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-service-ca\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856377 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca82da46-ac84-4b79-b5f4-2132a87be65b-audit-dir\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856420 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856452 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856513 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856552 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-login\") pod 
\"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856597 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-audit-policies\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856639 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blczs\" (UniqueName: \"kubernetes.io/projected/ca82da46-ac84-4b79-b5f4-2132a87be65b-kube-api-access-blczs\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.856802 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca82da46-ac84-4b79-b5f4-2132a87be65b-audit-dir\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.859138 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-service-ca\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.859776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.859848 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.860180 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-audit-policies\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.861753 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-error\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.862316 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: 
I0312 13:15:16.862900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.863394 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-template-login\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.863632 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.865194 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-router-certs\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.865395 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.865590 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca82da46-ac84-4b79-b5f4-2132a87be65b-v4-0-config-system-session\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.877525 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blczs\" (UniqueName: \"kubernetes.io/projected/ca82da46-ac84-4b79-b5f4-2132a87be65b-kube-api-access-blczs\") pod \"oauth-openshift-bfb76855c-tdq4k\" (UID: \"ca82da46-ac84-4b79-b5f4-2132a87be65b\") " pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:16 crc kubenswrapper[4921]: I0312 13:15:16.892315 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.409596 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-bfb76855c-tdq4k"] Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.440810 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.440849 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gg92s" event={"ID":"88e0b0eb-d051-410d-b2e8-c80e9fe3fdce","Type":"ContainerDied","Data":"44162cb4e9c6f62c5c9294b6b379e6c7c7bf059550665fcddd2041e755a22127"} Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.441494 4921 scope.go:117] "RemoveContainer" containerID="a911277ac8f2809155389ac0eafd14fe03e913d28177a70db104bfab58669e17" Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.441966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" event={"ID":"ca82da46-ac84-4b79-b5f4-2132a87be65b","Type":"ContainerStarted","Data":"4e24af92ada094fb1a7073733d6be81c98ed94ae36e48f3f8e6316fe8d586eba"} Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.524059 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gg92s"] Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.533419 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gg92s"] Mar 12 13:15:17 crc kubenswrapper[4921]: I0312 13:15:17.990365 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e0b0eb-d051-410d-b2e8-c80e9fe3fdce" path="/var/lib/kubelet/pods/88e0b0eb-d051-410d-b2e8-c80e9fe3fdce/volumes" Mar 12 13:15:18 crc kubenswrapper[4921]: I0312 13:15:18.449095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" event={"ID":"ca82da46-ac84-4b79-b5f4-2132a87be65b","Type":"ContainerStarted","Data":"24d386f87d148721ce36d657de22d49a1bf5aa10fa6e26b28db4df8166612a07"} Mar 12 13:15:18 crc kubenswrapper[4921]: I0312 13:15:18.450486 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:18 crc kubenswrapper[4921]: I0312 13:15:18.461338 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" Mar 12 13:15:18 crc kubenswrapper[4921]: I0312 13:15:18.489192 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-bfb76855c-tdq4k" podStartSLOduration=28.489160789 podStartE2EDuration="28.489160789s" podCreationTimestamp="2026-03-12 13:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:15:18.481537276 +0000 UTC m=+341.171609287" watchObservedRunningTime="2026-03-12 13:15:18.489160789 +0000 UTC m=+341.179232810" Mar 12 13:15:22 crc kubenswrapper[4921]: I0312 13:15:22.682914 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc"] Mar 12 13:15:22 crc kubenswrapper[4921]: I0312 13:15:22.683367 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" podUID="33092d88-16e4-4910-8da5-602c25961cb1" containerName="route-controller-manager" containerID="cri-o://830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05" gracePeriod=30 Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.177662 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.247652 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-config\") pod \"33092d88-16e4-4910-8da5-602c25961cb1\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.248146 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33092d88-16e4-4910-8da5-602c25961cb1-serving-cert\") pod \"33092d88-16e4-4910-8da5-602c25961cb1\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.248251 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-client-ca\") pod \"33092d88-16e4-4910-8da5-602c25961cb1\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.248311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6ckl\" (UniqueName: \"kubernetes.io/projected/33092d88-16e4-4910-8da5-602c25961cb1-kube-api-access-b6ckl\") pod \"33092d88-16e4-4910-8da5-602c25961cb1\" (UID: \"33092d88-16e4-4910-8da5-602c25961cb1\") " Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.248741 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-config" (OuterVolumeSpecName: "config") pod "33092d88-16e4-4910-8da5-602c25961cb1" (UID: "33092d88-16e4-4910-8da5-602c25961cb1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.249586 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-client-ca" (OuterVolumeSpecName: "client-ca") pod "33092d88-16e4-4910-8da5-602c25961cb1" (UID: "33092d88-16e4-4910-8da5-602c25961cb1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.254169 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33092d88-16e4-4910-8da5-602c25961cb1-kube-api-access-b6ckl" (OuterVolumeSpecName: "kube-api-access-b6ckl") pod "33092d88-16e4-4910-8da5-602c25961cb1" (UID: "33092d88-16e4-4910-8da5-602c25961cb1"). InnerVolumeSpecName "kube-api-access-b6ckl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.257570 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33092d88-16e4-4910-8da5-602c25961cb1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33092d88-16e4-4910-8da5-602c25961cb1" (UID: "33092d88-16e4-4910-8da5-602c25961cb1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.349561 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6ckl\" (UniqueName: \"kubernetes.io/projected/33092d88-16e4-4910-8da5-602c25961cb1-kube-api-access-b6ckl\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.349710 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.349773 4921 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33092d88-16e4-4910-8da5-602c25961cb1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.349866 4921 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33092d88-16e4-4910-8da5-602c25961cb1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.478888 4921 generic.go:334] "Generic (PLEG): container finished" podID="33092d88-16e4-4910-8da5-602c25961cb1" containerID="830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05" exitCode=0 Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.478996 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.479039 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" event={"ID":"33092d88-16e4-4910-8da5-602c25961cb1","Type":"ContainerDied","Data":"830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05"} Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.479313 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc" event={"ID":"33092d88-16e4-4910-8da5-602c25961cb1","Type":"ContainerDied","Data":"617f904436af4a59cabee6095d3fedf9cc233e00e0e593009e36cffeadfadeae"} Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.479386 4921 scope.go:117] "RemoveContainer" containerID="830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.513409 4921 scope.go:117] "RemoveContainer" containerID="830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05" Mar 12 13:15:23 crc kubenswrapper[4921]: E0312 13:15:23.513935 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05\": container with ID starting with 830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05 not found: ID does not exist" containerID="830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.513980 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05"} err="failed to get container status \"830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05\": rpc error: code = NotFound desc 
= could not find container \"830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05\": container with ID starting with 830f2a94ed931a58a0909f96dfee1d69642921b4683d9112eeba7eaac8f62e05 not found: ID does not exist" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.514357 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc"] Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.517806 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c679cff5-mrgdc"] Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.900837 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4"] Mar 12 13:15:23 crc kubenswrapper[4921]: E0312 13:15:23.903044 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33092d88-16e4-4910-8da5-602c25961cb1" containerName="route-controller-manager" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.903125 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="33092d88-16e4-4910-8da5-602c25961cb1" containerName="route-controller-manager" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.903347 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="33092d88-16e4-4910-8da5-602c25961cb1" containerName="route-controller-manager" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.903967 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.906997 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4"] Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.907168 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.907206 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.907307 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.907318 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.907409 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.911936 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 13:15:23 crc kubenswrapper[4921]: I0312 13:15:23.991922 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33092d88-16e4-4910-8da5-602c25961cb1" path="/var/lib/kubelet/pods/33092d88-16e4-4910-8da5-602c25961cb1/volumes" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.058888 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af702405-527a-4c5f-b351-a4cddb93e1ab-client-ca\") pod 
\"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.058978 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af702405-527a-4c5f-b351-a4cddb93e1ab-config\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.059067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxjlg\" (UniqueName: \"kubernetes.io/projected/af702405-527a-4c5f-b351-a4cddb93e1ab-kube-api-access-jxjlg\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.059119 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af702405-527a-4c5f-b351-a4cddb93e1ab-serving-cert\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.160797 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af702405-527a-4c5f-b351-a4cddb93e1ab-serving-cert\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc 
kubenswrapper[4921]: I0312 13:15:24.161795 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af702405-527a-4c5f-b351-a4cddb93e1ab-client-ca\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.163246 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af702405-527a-4c5f-b351-a4cddb93e1ab-client-ca\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.165938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af702405-527a-4c5f-b351-a4cddb93e1ab-config\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.166121 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxjlg\" (UniqueName: \"kubernetes.io/projected/af702405-527a-4c5f-b351-a4cddb93e1ab-kube-api-access-jxjlg\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.167222 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af702405-527a-4c5f-b351-a4cddb93e1ab-config\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: 
\"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.176635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af702405-527a-4c5f-b351-a4cddb93e1ab-serving-cert\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.201205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxjlg\" (UniqueName: \"kubernetes.io/projected/af702405-527a-4c5f-b351-a4cddb93e1ab-kube-api-access-jxjlg\") pod \"route-controller-manager-5b6d9f9f8b-wvpg4\" (UID: \"af702405-527a-4c5f-b351-a4cddb93e1ab\") " pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.242155 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:24 crc kubenswrapper[4921]: I0312 13:15:24.659158 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4"] Mar 12 13:15:24 crc kubenswrapper[4921]: W0312 13:15:24.663198 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf702405_527a_4c5f_b351_a4cddb93e1ab.slice/crio-18ca7be29099086e2312606865ed9d6a65372b4693cf8661c99d43b0378a233b WatchSource:0}: Error finding container 18ca7be29099086e2312606865ed9d6a65372b4693cf8661c99d43b0378a233b: Status 404 returned error can't find the container with id 18ca7be29099086e2312606865ed9d6a65372b4693cf8661c99d43b0378a233b Mar 12 13:15:25 crc kubenswrapper[4921]: I0312 13:15:25.493298 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" event={"ID":"af702405-527a-4c5f-b351-a4cddb93e1ab","Type":"ContainerStarted","Data":"446671903f7a6fdbc877f1f64fe5450fb07e9bf822d5c43c92ef8e0743e55df9"} Mar 12 13:15:25 crc kubenswrapper[4921]: I0312 13:15:25.494296 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 13:15:25 crc kubenswrapper[4921]: I0312 13:15:25.494309 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" event={"ID":"af702405-527a-4c5f-b351-a4cddb93e1ab","Type":"ContainerStarted","Data":"18ca7be29099086e2312606865ed9d6a65372b4693cf8661c99d43b0378a233b"} Mar 12 13:15:25 crc kubenswrapper[4921]: I0312 13:15:25.500963 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" Mar 12 
13:15:25 crc kubenswrapper[4921]: I0312 13:15:25.545273 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b6d9f9f8b-wvpg4" podStartSLOduration=3.545254545 podStartE2EDuration="3.545254545s" podCreationTimestamp="2026-03-12 13:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:15:25.524178004 +0000 UTC m=+348.214250025" watchObservedRunningTime="2026-03-12 13:15:25.545254545 +0000 UTC m=+348.235326506" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.081248 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvkz9"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.081879 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lvkz9" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="registry-server" containerID="cri-o://099bc7c95faf2466e5fc9b2cb9bdd431336e4f297d4ccc1fbbfff42554b33019" gracePeriod=30 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.090003 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ll7bv"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.090253 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ll7bv" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="registry-server" containerID="cri-o://6e406b90e54138979540bc2ad36218f2b8fc106654e5200e87fb642f8e0786e6" gracePeriod=30 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.107019 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gdw7"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.107269 4921 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" podUID="680f8033-da87-4897-bf8c-23b2ad8af659" containerName="marketplace-operator" containerID="cri-o://422deaa84203af77ccb5781bfdd59512046bd17f282a52a9d3f6b65053857949" gracePeriod=30 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.117977 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tzw9"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.118249 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6tzw9" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="registry-server" containerID="cri-o://3febb8809e812f601cb90a1cdaf8dd74fde388aa3401509f9ee8a97dd8003ec0" gracePeriod=30 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.124756 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6c4k"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.125070 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m6c4k" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="registry-server" containerID="cri-o://87350e95bdd8b46b20adb55b84579df94eff35960e97f961929a1f169e337d9a" gracePeriod=30 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.129151 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cc774"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.130088 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.144531 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cc774"] Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.225585 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8eea941-027c-44f8-a189-b7e9b3c6cb55-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.225645 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4hf\" (UniqueName: \"kubernetes.io/projected/f8eea941-027c-44f8-a189-b7e9b3c6cb55-kube-api-access-qb4hf\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.225857 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f8eea941-027c-44f8-a189-b7e9b3c6cb55-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.327490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8eea941-027c-44f8-a189-b7e9b3c6cb55-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cc774\" (UID: 
\"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.327549 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4hf\" (UniqueName: \"kubernetes.io/projected/f8eea941-027c-44f8-a189-b7e9b3c6cb55-kube-api-access-qb4hf\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.327591 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f8eea941-027c-44f8-a189-b7e9b3c6cb55-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.328924 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8eea941-027c-44f8-a189-b7e9b3c6cb55-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.334061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f8eea941-027c-44f8-a189-b7e9b3c6cb55-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.349041 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qb4hf\" (UniqueName: \"kubernetes.io/projected/f8eea941-027c-44f8-a189-b7e9b3c6cb55-kube-api-access-qb4hf\") pod \"marketplace-operator-79b997595-cc774\" (UID: \"f8eea941-027c-44f8-a189-b7e9b3c6cb55\") " pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.510475 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.518242 4921 generic.go:334] "Generic (PLEG): container finished" podID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerID="099bc7c95faf2466e5fc9b2cb9bdd431336e4f297d4ccc1fbbfff42554b33019" exitCode=0 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.518336 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerDied","Data":"099bc7c95faf2466e5fc9b2cb9bdd431336e4f297d4ccc1fbbfff42554b33019"} Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.521928 4921 generic.go:334] "Generic (PLEG): container finished" podID="680f8033-da87-4897-bf8c-23b2ad8af659" containerID="422deaa84203af77ccb5781bfdd59512046bd17f282a52a9d3f6b65053857949" exitCode=0 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.522002 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" event={"ID":"680f8033-da87-4897-bf8c-23b2ad8af659","Type":"ContainerDied","Data":"422deaa84203af77ccb5781bfdd59512046bd17f282a52a9d3f6b65053857949"} Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.524655 4921 generic.go:334] "Generic (PLEG): container finished" podID="d6868925-795c-4765-9343-0b147db98216" containerID="6e406b90e54138979540bc2ad36218f2b8fc106654e5200e87fb642f8e0786e6" exitCode=0 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.524700 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll7bv" event={"ID":"d6868925-795c-4765-9343-0b147db98216","Type":"ContainerDied","Data":"6e406b90e54138979540bc2ad36218f2b8fc106654e5200e87fb642f8e0786e6"} Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.526835 4921 generic.go:334] "Generic (PLEG): container finished" podID="7fef7638-98df-405a-b04b-f47997b46eac" containerID="87350e95bdd8b46b20adb55b84579df94eff35960e97f961929a1f169e337d9a" exitCode=0 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.526870 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerDied","Data":"87350e95bdd8b46b20adb55b84579df94eff35960e97f961929a1f169e337d9a"} Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.528637 4921 generic.go:334] "Generic (PLEG): container finished" podID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerID="3febb8809e812f601cb90a1cdaf8dd74fde388aa3401509f9ee8a97dd8003ec0" exitCode=0 Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.528662 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerDied","Data":"3febb8809e812f601cb90a1cdaf8dd74fde388aa3401509f9ee8a97dd8003ec0"} Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.670523 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.733132 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-catalog-content\") pod \"d6868925-795c-4765-9343-0b147db98216\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.733179 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-utilities\") pod \"d6868925-795c-4765-9343-0b147db98216\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.733243 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqxbg\" (UniqueName: \"kubernetes.io/projected/d6868925-795c-4765-9343-0b147db98216-kube-api-access-xqxbg\") pod \"d6868925-795c-4765-9343-0b147db98216\" (UID: \"d6868925-795c-4765-9343-0b147db98216\") " Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.734360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-utilities" (OuterVolumeSpecName: "utilities") pod "d6868925-795c-4765-9343-0b147db98216" (UID: "d6868925-795c-4765-9343-0b147db98216"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.755072 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6868925-795c-4765-9343-0b147db98216-kube-api-access-xqxbg" (OuterVolumeSpecName: "kube-api-access-xqxbg") pod "d6868925-795c-4765-9343-0b147db98216" (UID: "d6868925-795c-4765-9343-0b147db98216"). InnerVolumeSpecName "kube-api-access-xqxbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.817278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6868925-795c-4765-9343-0b147db98216" (UID: "d6868925-795c-4765-9343-0b147db98216"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.835148 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqxbg\" (UniqueName: \"kubernetes.io/projected/d6868925-795c-4765-9343-0b147db98216-kube-api-access-xqxbg\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.835176 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.835188 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6868925-795c-4765-9343-0b147db98216-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.857253 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.935751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-catalog-content\") pod \"82dff338-35e1-44df-8f20-a4d4d8b3c198\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.935888 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rnbt\" (UniqueName: \"kubernetes.io/projected/82dff338-35e1-44df-8f20-a4d4d8b3c198-kube-api-access-2rnbt\") pod \"82dff338-35e1-44df-8f20-a4d4d8b3c198\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.935957 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-utilities\") pod \"82dff338-35e1-44df-8f20-a4d4d8b3c198\" (UID: \"82dff338-35e1-44df-8f20-a4d4d8b3c198\") " Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.936889 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-utilities" (OuterVolumeSpecName: "utilities") pod "82dff338-35e1-44df-8f20-a4d4d8b3c198" (UID: "82dff338-35e1-44df-8f20-a4d4d8b3c198"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.944630 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82dff338-35e1-44df-8f20-a4d4d8b3c198-kube-api-access-2rnbt" (OuterVolumeSpecName: "kube-api-access-2rnbt") pod "82dff338-35e1-44df-8f20-a4d4d8b3c198" (UID: "82dff338-35e1-44df-8f20-a4d4d8b3c198"). InnerVolumeSpecName "kube-api-access-2rnbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.973396 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.982110 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:15:28 crc kubenswrapper[4921]: I0312 13:15:28.992933 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.020054 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82dff338-35e1-44df-8f20-a4d4d8b3c198" (UID: "82dff338-35e1-44df-8f20-a4d4d8b3c198"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.037550 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv5s4\" (UniqueName: \"kubernetes.io/projected/680f8033-da87-4897-bf8c-23b2ad8af659-kube-api-access-zv5s4\") pod \"680f8033-da87-4897-bf8c-23b2ad8af659\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.037637 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-operator-metrics\") pod \"680f8033-da87-4897-bf8c-23b2ad8af659\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.037689 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-utilities\") pod \"7fef7638-98df-405a-b04b-f47997b46eac\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.037754 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-trusted-ca\") pod \"680f8033-da87-4897-bf8c-23b2ad8af659\" (UID: \"680f8033-da87-4897-bf8c-23b2ad8af659\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.037772 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-catalog-content\") pod \"7fef7638-98df-405a-b04b-f47997b46eac\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.037791 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-gzpbh\" (UniqueName: \"kubernetes.io/projected/7fef7638-98df-405a-b04b-f47997b46eac-kube-api-access-gzpbh\") pod \"7fef7638-98df-405a-b04b-f47997b46eac\" (UID: \"7fef7638-98df-405a-b04b-f47997b46eac\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.038155 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rnbt\" (UniqueName: \"kubernetes.io/projected/82dff338-35e1-44df-8f20-a4d4d8b3c198-kube-api-access-2rnbt\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.038172 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.038181 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82dff338-35e1-44df-8f20-a4d4d8b3c198-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.039037 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-utilities" (OuterVolumeSpecName: "utilities") pod "7fef7638-98df-405a-b04b-f47997b46eac" (UID: "7fef7638-98df-405a-b04b-f47997b46eac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.039235 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "680f8033-da87-4897-bf8c-23b2ad8af659" (UID: "680f8033-da87-4897-bf8c-23b2ad8af659"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.041454 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680f8033-da87-4897-bf8c-23b2ad8af659-kube-api-access-zv5s4" (OuterVolumeSpecName: "kube-api-access-zv5s4") pod "680f8033-da87-4897-bf8c-23b2ad8af659" (UID: "680f8033-da87-4897-bf8c-23b2ad8af659"). InnerVolumeSpecName "kube-api-access-zv5s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.041949 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "680f8033-da87-4897-bf8c-23b2ad8af659" (UID: "680f8033-da87-4897-bf8c-23b2ad8af659"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.042706 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fef7638-98df-405a-b04b-f47997b46eac-kube-api-access-gzpbh" (OuterVolumeSpecName: "kube-api-access-gzpbh") pod "7fef7638-98df-405a-b04b-f47997b46eac" (UID: "7fef7638-98df-405a-b04b-f47997b46eac"). InnerVolumeSpecName "kube-api-access-gzpbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.142503 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-utilities\") pod \"2b934596-5580-41ba-8ad2-8722f4cf476d\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.144562 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-catalog-content\") pod \"2b934596-5580-41ba-8ad2-8722f4cf476d\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.145230 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/2b934596-5580-41ba-8ad2-8722f4cf476d-kube-api-access-f8nv5\") pod \"2b934596-5580-41ba-8ad2-8722f4cf476d\" (UID: \"2b934596-5580-41ba-8ad2-8722f4cf476d\") " Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.144007 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-utilities" (OuterVolumeSpecName: "utilities") pod "2b934596-5580-41ba-8ad2-8722f4cf476d" (UID: "2b934596-5580-41ba-8ad2-8722f4cf476d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.145950 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv5s4\" (UniqueName: \"kubernetes.io/projected/680f8033-da87-4897-bf8c-23b2ad8af659-kube-api-access-zv5s4\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.146100 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.146224 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.146349 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.146468 4921 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/680f8033-da87-4897-bf8c-23b2ad8af659-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.146586 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzpbh\" (UniqueName: \"kubernetes.io/projected/7fef7638-98df-405a-b04b-f47997b46eac-kube-api-access-gzpbh\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.149748 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b934596-5580-41ba-8ad2-8722f4cf476d-kube-api-access-f8nv5" (OuterVolumeSpecName: "kube-api-access-f8nv5") pod 
"2b934596-5580-41ba-8ad2-8722f4cf476d" (UID: "2b934596-5580-41ba-8ad2-8722f4cf476d"). InnerVolumeSpecName "kube-api-access-f8nv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.176177 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b934596-5580-41ba-8ad2-8722f4cf476d" (UID: "2b934596-5580-41ba-8ad2-8722f4cf476d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.179974 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-cc774"] Mar 12 13:15:29 crc kubenswrapper[4921]: W0312 13:15:29.180414 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8eea941_027c_44f8_a189_b7e9b3c6cb55.slice/crio-46b7042c7c8c9f571f1c207cd83ee1fce25737e2eb175d9a476fe07dd32023ba WatchSource:0}: Error finding container 46b7042c7c8c9f571f1c207cd83ee1fce25737e2eb175d9a476fe07dd32023ba: Status 404 returned error can't find the container with id 46b7042c7c8c9f571f1c207cd83ee1fce25737e2eb175d9a476fe07dd32023ba Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.184461 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fef7638-98df-405a-b04b-f47997b46eac" (UID: "7fef7638-98df-405a-b04b-f47997b46eac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.247607 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b934596-5580-41ba-8ad2-8722f4cf476d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.247630 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fef7638-98df-405a-b04b-f47997b46eac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.247640 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8nv5\" (UniqueName: \"kubernetes.io/projected/2b934596-5580-41ba-8ad2-8722f4cf476d-kube-api-access-f8nv5\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.533788 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" event={"ID":"f8eea941-027c-44f8-a189-b7e9b3c6cb55","Type":"ContainerStarted","Data":"5d62ddccfe1657aa6208e89a4d61e54f1d9712db0b9eca9de6d11ed45cac96fa"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.533858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" event={"ID":"f8eea941-027c-44f8-a189-b7e9b3c6cb55","Type":"ContainerStarted","Data":"46b7042c7c8c9f571f1c207cd83ee1fce25737e2eb175d9a476fe07dd32023ba"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.534015 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.535751 4921 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-cc774 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.70:8080/healthz\": dial tcp 10.217.0.70:8080: connect: connection refused" start-of-body= Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.535797 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" podUID="f8eea941-027c-44f8-a189-b7e9b3c6cb55" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.70:8080/healthz\": dial tcp 10.217.0.70:8080: connect: connection refused" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.537137 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6tzw9" event={"ID":"2b934596-5580-41ba-8ad2-8722f4cf476d","Type":"ContainerDied","Data":"1030c6eede1e10e8324e130cc3a0269d20c3774c6fecbfbec94ac44e1a5bebf3"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.537198 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6tzw9" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.537204 4921 scope.go:117] "RemoveContainer" containerID="3febb8809e812f601cb90a1cdaf8dd74fde388aa3401509f9ee8a97dd8003ec0" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.539056 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lvkz9" event={"ID":"82dff338-35e1-44df-8f20-a4d4d8b3c198","Type":"ContainerDied","Data":"6981606ed8b29046bde701e12e11093518e4ea9b9abe1c310554637754003ec3"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.539066 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lvkz9" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.550183 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.550400 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7gdw7" event={"ID":"680f8033-da87-4897-bf8c-23b2ad8af659","Type":"ContainerDied","Data":"0fdae05221a58a82c0c0e416b5d7fd3d85e42031c179195c3bea64c40366cd33"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.558591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll7bv" event={"ID":"d6868925-795c-4765-9343-0b147db98216","Type":"ContainerDied","Data":"b20f7dc4ec8a60f8e268eca74ad424470f1ab46f3317b3392240dbffbe2d776e"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.558695 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ll7bv" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.559640 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" podStartSLOduration=1.559628691 podStartE2EDuration="1.559628691s" podCreationTimestamp="2026-03-12 13:15:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:15:29.552731382 +0000 UTC m=+352.242803363" watchObservedRunningTime="2026-03-12 13:15:29.559628691 +0000 UTC m=+352.249700662" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.565185 4921 scope.go:117] "RemoveContainer" containerID="95ba84464b900e02fd0ae21b7b33879edd89ecb73645b67258f5728da6075215" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.573204 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m6c4k" 
event={"ID":"7fef7638-98df-405a-b04b-f47997b46eac","Type":"ContainerDied","Data":"2088bf489087b1ad5f240daf5b7a6b8bd0ad936491075b8b1a3102de7c356546"} Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.573399 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m6c4k" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.579661 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tzw9"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.599797 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6tzw9"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.604776 4921 scope.go:117] "RemoveContainer" containerID="e2fb925d9c390d588c644d7893e9e701b9fd982b2d5ab2f17b19f9d6ea8f2acc" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.610427 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lvkz9"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.614010 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lvkz9"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.630565 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gdw7"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.634158 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7gdw7"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.642004 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ll7bv"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.645632 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ll7bv"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.650002 4921 
scope.go:117] "RemoveContainer" containerID="099bc7c95faf2466e5fc9b2cb9bdd431336e4f297d4ccc1fbbfff42554b33019" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.655903 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m6c4k"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.659515 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m6c4k"] Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.671629 4921 scope.go:117] "RemoveContainer" containerID="b214c191ba23cf298a0840b1884d17db2c3fd3731dcd017fe97b6fc0f9cbb977" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.687889 4921 scope.go:117] "RemoveContainer" containerID="9ad326b7d0a378ea95e6e8aecd64182c5ec970f35774884b64f7bb38728636ab" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.713877 4921 scope.go:117] "RemoveContainer" containerID="422deaa84203af77ccb5781bfdd59512046bd17f282a52a9d3f6b65053857949" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.727644 4921 scope.go:117] "RemoveContainer" containerID="6e406b90e54138979540bc2ad36218f2b8fc106654e5200e87fb642f8e0786e6" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.741707 4921 scope.go:117] "RemoveContainer" containerID="550bea0c575ebee0dad224b93817bc115249971c559027fe35e56d75bd1233aa" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.767584 4921 scope.go:117] "RemoveContainer" containerID="188e540798880f0c525ad97b0007253b8558e5f555ef06379a81211555274157" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.789034 4921 scope.go:117] "RemoveContainer" containerID="87350e95bdd8b46b20adb55b84579df94eff35960e97f961929a1f169e337d9a" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.805179 4921 scope.go:117] "RemoveContainer" containerID="b3098ec18ce1e8a07b54a5064ee66b9aa4b52da2e9dd4e250506dea2df8590cd" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.824504 4921 scope.go:117] "RemoveContainer" 
containerID="8a428a004257dc86d124063cbca02a8f630a0448db405cd9804ec3c2ca410ac8" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.990182 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" path="/var/lib/kubelet/pods/2b934596-5580-41ba-8ad2-8722f4cf476d/volumes" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.990776 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680f8033-da87-4897-bf8c-23b2ad8af659" path="/var/lib/kubelet/pods/680f8033-da87-4897-bf8c-23b2ad8af659/volumes" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.991213 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fef7638-98df-405a-b04b-f47997b46eac" path="/var/lib/kubelet/pods/7fef7638-98df-405a-b04b-f47997b46eac/volumes" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.992320 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" path="/var/lib/kubelet/pods/82dff338-35e1-44df-8f20-a4d4d8b3c198/volumes" Mar 12 13:15:29 crc kubenswrapper[4921]: I0312 13:15:29.992948 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6868925-795c-4765-9343-0b147db98216" path="/var/lib/kubelet/pods/d6868925-795c-4765-9343-0b147db98216/volumes" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.294830 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jnh65"] Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295024 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295035 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295044 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295050 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295060 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295066 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295074 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295079 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295089 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="680f8033-da87-4897-bf8c-23b2ad8af659" containerName="marketplace-operator" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295094 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="680f8033-da87-4897-bf8c-23b2ad8af659" containerName="marketplace-operator" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295101 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295107 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295113 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295120 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295126 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295132 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295139 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295145 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295154 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295159 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="extract-content" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295167 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295172 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295180 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295186 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="extract-utilities" Mar 12 13:15:30 crc kubenswrapper[4921]: E0312 13:15:30.295195 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295308 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b934596-5580-41ba-8ad2-8722f4cf476d" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295322 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="82dff338-35e1-44df-8f20-a4d4d8b3c198" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295331 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6868925-795c-4765-9343-0b147db98216" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295340 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="680f8033-da87-4897-bf8c-23b2ad8af659" containerName="marketplace-operator" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.295351 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fef7638-98df-405a-b04b-f47997b46eac" containerName="registry-server" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.297485 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.299775 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.315213 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnh65"] Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.358779 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-catalog-content\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.358865 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-utilities\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.358942 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwsg7\" (UniqueName: \"kubernetes.io/projected/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-kube-api-access-zwsg7\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.459896 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-catalog-content\") pod \"redhat-marketplace-jnh65\" (UID: 
\"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.459986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-utilities\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.460023 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwsg7\" (UniqueName: \"kubernetes.io/projected/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-kube-api-access-zwsg7\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.461053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-catalog-content\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.461333 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-utilities\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.477887 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwsg7\" (UniqueName: \"kubernetes.io/projected/587b8721-fb47-4cd2-8c47-917e0b6dd5dc-kube-api-access-zwsg7\") pod \"redhat-marketplace-jnh65\" (UID: \"587b8721-fb47-4cd2-8c47-917e0b6dd5dc\") " 
pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.492302 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6gjth"] Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.493306 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.497152 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.511247 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gjth"] Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.561665 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d61927-e67d-49cf-97e5-70d2fed9192b-utilities\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.561857 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d61927-e67d-49cf-97e5-70d2fed9192b-catalog-content\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.561883 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scd2r\" (UniqueName: \"kubernetes.io/projected/01d61927-e67d-49cf-97e5-70d2fed9192b-kube-api-access-scd2r\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " 
pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.591804 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-cc774" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.639598 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.664256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d61927-e67d-49cf-97e5-70d2fed9192b-catalog-content\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.664332 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scd2r\" (UniqueName: \"kubernetes.io/projected/01d61927-e67d-49cf-97e5-70d2fed9192b-kube-api-access-scd2r\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.664383 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d61927-e67d-49cf-97e5-70d2fed9192b-utilities\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.666808 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01d61927-e67d-49cf-97e5-70d2fed9192b-catalog-content\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " 
pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.667246 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01d61927-e67d-49cf-97e5-70d2fed9192b-utilities\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.684079 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scd2r\" (UniqueName: \"kubernetes.io/projected/01d61927-e67d-49cf-97e5-70d2fed9192b-kube-api-access-scd2r\") pod \"certified-operators-6gjth\" (UID: \"01d61927-e67d-49cf-97e5-70d2fed9192b\") " pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:30 crc kubenswrapper[4921]: I0312 13:15:30.828515 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.040543 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jnh65"] Mar 12 13:15:31 crc kubenswrapper[4921]: W0312 13:15:31.041493 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587b8721_fb47_4cd2_8c47_917e0b6dd5dc.slice/crio-0283bc2495ac53902a89b8a46654ec4279fa7b9ed1c2cda435fdb9032843f92a WatchSource:0}: Error finding container 0283bc2495ac53902a89b8a46654ec4279fa7b9ed1c2cda435fdb9032843f92a: Status 404 returned error can't find the container with id 0283bc2495ac53902a89b8a46654ec4279fa7b9ed1c2cda435fdb9032843f92a Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.226659 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gjth"] Mar 12 13:15:31 crc kubenswrapper[4921]: W0312 13:15:31.283788 4921 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d61927_e67d_49cf_97e5_70d2fed9192b.slice/crio-d0bd7f530c084f4c1e7a8d80e23a6713f4fb991c41b438237a8f6028216a3d6b WatchSource:0}: Error finding container d0bd7f530c084f4c1e7a8d80e23a6713f4fb991c41b438237a8f6028216a3d6b: Status 404 returned error can't find the container with id d0bd7f530c084f4c1e7a8d80e23a6713f4fb991c41b438237a8f6028216a3d6b Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.609975 4921 generic.go:334] "Generic (PLEG): container finished" podID="01d61927-e67d-49cf-97e5-70d2fed9192b" containerID="6a55849e4c4bd4a84babc093ebc13f70a0c93a452dcad58ae658a2ab3a602a2a" exitCode=0 Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.610080 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gjth" event={"ID":"01d61927-e67d-49cf-97e5-70d2fed9192b","Type":"ContainerDied","Data":"6a55849e4c4bd4a84babc093ebc13f70a0c93a452dcad58ae658a2ab3a602a2a"} Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.610460 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gjth" event={"ID":"01d61927-e67d-49cf-97e5-70d2fed9192b","Type":"ContainerStarted","Data":"d0bd7f530c084f4c1e7a8d80e23a6713f4fb991c41b438237a8f6028216a3d6b"} Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.614407 4921 generic.go:334] "Generic (PLEG): container finished" podID="587b8721-fb47-4cd2-8c47-917e0b6dd5dc" containerID="23962ad08acc0efd592902b482ba7d23c342f5cd684f53e91d4a17f3ba4a5967" exitCode=0 Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.614631 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnh65" event={"ID":"587b8721-fb47-4cd2-8c47-917e0b6dd5dc","Type":"ContainerDied","Data":"23962ad08acc0efd592902b482ba7d23c342f5cd684f53e91d4a17f3ba4a5967"} Mar 12 13:15:31 crc kubenswrapper[4921]: I0312 13:15:31.614683 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnh65" event={"ID":"587b8721-fb47-4cd2-8c47-917e0b6dd5dc","Type":"ContainerStarted","Data":"0283bc2495ac53902a89b8a46654ec4279fa7b9ed1c2cda435fdb9032843f92a"} Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.081823 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5nktn" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.151302 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f2bw7"] Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.621698 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnh65" event={"ID":"587b8721-fb47-4cd2-8c47-917e0b6dd5dc","Type":"ContainerStarted","Data":"4cadf550a5dda502a20af530a9752816ee593bd4783eb1623cfa3336664d5135"} Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.698994 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-26m8d"] Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.700385 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.704148 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.710651 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-26m8d"] Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.795734 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-utilities\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.795785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4572h\" (UniqueName: \"kubernetes.io/projected/6bf56069-058f-4126-8b69-bd0011f99b1e-kube-api-access-4572h\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.795843 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-catalog-content\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.893756 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-drchr"] Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.894677 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.897184 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4572h\" (UniqueName: \"kubernetes.io/projected/6bf56069-058f-4126-8b69-bd0011f99b1e-kube-api-access-4572h\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.897366 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-catalog-content\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.897553 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-utilities\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.898516 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-utilities\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.899205 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.899666 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-catalog-content\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.918432 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-drchr"] Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.924062 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4572h\" (UniqueName: \"kubernetes.io/projected/6bf56069-058f-4126-8b69-bd0011f99b1e-kube-api-access-4572h\") pod \"redhat-operators-26m8d\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.999580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcthw\" (UniqueName: \"kubernetes.io/projected/333229d4-397f-4504-9e02-f793f42324f4-kube-api-access-jcthw\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:32 crc kubenswrapper[4921]: I0312 13:15:32.999637 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-utilities\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:32.999743 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-catalog-content\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " 
pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.046255 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.101480 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcthw\" (UniqueName: \"kubernetes.io/projected/333229d4-397f-4504-9e02-f793f42324f4-kube-api-access-jcthw\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.101522 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-utilities\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.101780 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-catalog-content\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.102319 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-catalog-content\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.102402 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-utilities\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.120537 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcthw\" (UniqueName: \"kubernetes.io/projected/333229d4-397f-4504-9e02-f793f42324f4-kube-api-access-jcthw\") pod \"community-operators-drchr\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.261748 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.520259 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-26m8d"] Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.675795 4921 generic.go:334] "Generic (PLEG): container finished" podID="587b8721-fb47-4cd2-8c47-917e0b6dd5dc" containerID="4cadf550a5dda502a20af530a9752816ee593bd4783eb1623cfa3336664d5135" exitCode=0 Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.675861 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnh65" event={"ID":"587b8721-fb47-4cd2-8c47-917e0b6dd5dc","Type":"ContainerDied","Data":"4cadf550a5dda502a20af530a9752816ee593bd4783eb1623cfa3336664d5135"} Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.680053 4921 generic.go:334] "Generic (PLEG): container finished" podID="01d61927-e67d-49cf-97e5-70d2fed9192b" containerID="f69ddfbb98de56633a6bef357466ddb2a0e40bebfc87ea13817cb12d2b22a240" exitCode=0 Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.680143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gjth" 
event={"ID":"01d61927-e67d-49cf-97e5-70d2fed9192b","Type":"ContainerDied","Data":"f69ddfbb98de56633a6bef357466ddb2a0e40bebfc87ea13817cb12d2b22a240"} Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.687637 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerStarted","Data":"4271ef9bdfd6025d75ccf51173b7b9384deb71a28c6ad392c410c7e84a768ff7"} Mar 12 13:15:33 crc kubenswrapper[4921]: I0312 13:15:33.730236 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-drchr"] Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.697177 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jnh65" event={"ID":"587b8721-fb47-4cd2-8c47-917e0b6dd5dc","Type":"ContainerStarted","Data":"e563b3f84bbe265a4ad8177413557f02aa947247b99e5b42e7f6272bd6b11f9d"} Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.698634 4921 generic.go:334] "Generic (PLEG): container finished" podID="333229d4-397f-4504-9e02-f793f42324f4" containerID="a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d" exitCode=0 Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.698714 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drchr" event={"ID":"333229d4-397f-4504-9e02-f793f42324f4","Type":"ContainerDied","Data":"a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d"} Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.698744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drchr" event={"ID":"333229d4-397f-4504-9e02-f793f42324f4","Type":"ContainerStarted","Data":"ef8c16ca9f3b46878fba56a45597c4580bde2e2ef5705c4331e696f3fe0664cd"} Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.701033 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6gjth" event={"ID":"01d61927-e67d-49cf-97e5-70d2fed9192b","Type":"ContainerStarted","Data":"9371149fdb62254ea665502432e559a3d6c046a64aaf36817e24e843c109472d"} Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.704750 4921 generic.go:334] "Generic (PLEG): container finished" podID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerID="4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2" exitCode=0 Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.704792 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerDied","Data":"4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2"} Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.766801 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jnh65" podStartSLOduration=2.094277618 podStartE2EDuration="4.766779667s" podCreationTimestamp="2026-03-12 13:15:30 +0000 UTC" firstStartedPulling="2026-03-12 13:15:31.615730733 +0000 UTC m=+354.305802744" lastFinishedPulling="2026-03-12 13:15:34.288232782 +0000 UTC m=+356.978304793" observedRunningTime="2026-03-12 13:15:34.765131815 +0000 UTC m=+357.455203786" watchObservedRunningTime="2026-03-12 13:15:34.766779667 +0000 UTC m=+357.456851638" Mar 12 13:15:34 crc kubenswrapper[4921]: I0312 13:15:34.836935 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6gjth" podStartSLOduration=2.254204233 podStartE2EDuration="4.836909721s" podCreationTimestamp="2026-03-12 13:15:30 +0000 UTC" firstStartedPulling="2026-03-12 13:15:31.613202763 +0000 UTC m=+354.303274734" lastFinishedPulling="2026-03-12 13:15:34.195908211 +0000 UTC m=+356.885980222" observedRunningTime="2026-03-12 13:15:34.829302449 +0000 UTC m=+357.519374420" watchObservedRunningTime="2026-03-12 
13:15:34.836909721 +0000 UTC m=+357.526981702" Mar 12 13:15:36 crc kubenswrapper[4921]: I0312 13:15:36.717090 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerStarted","Data":"a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4"} Mar 12 13:15:36 crc kubenswrapper[4921]: I0312 13:15:36.719328 4921 generic.go:334] "Generic (PLEG): container finished" podID="333229d4-397f-4504-9e02-f793f42324f4" containerID="3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4" exitCode=0 Mar 12 13:15:36 crc kubenswrapper[4921]: I0312 13:15:36.719372 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drchr" event={"ID":"333229d4-397f-4504-9e02-f793f42324f4","Type":"ContainerDied","Data":"3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4"} Mar 12 13:15:37 crc kubenswrapper[4921]: I0312 13:15:37.728716 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drchr" event={"ID":"333229d4-397f-4504-9e02-f793f42324f4","Type":"ContainerStarted","Data":"0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135"} Mar 12 13:15:37 crc kubenswrapper[4921]: I0312 13:15:37.731600 4921 generic.go:334] "Generic (PLEG): container finished" podID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerID="a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4" exitCode=0 Mar 12 13:15:37 crc kubenswrapper[4921]: I0312 13:15:37.731636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerDied","Data":"a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4"} Mar 12 13:15:37 crc kubenswrapper[4921]: I0312 13:15:37.775969 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-drchr" podStartSLOduration=3.251422368 podStartE2EDuration="5.775929741s" podCreationTimestamp="2026-03-12 13:15:32 +0000 UTC" firstStartedPulling="2026-03-12 13:15:34.701346403 +0000 UTC m=+357.391418414" lastFinishedPulling="2026-03-12 13:15:37.225853776 +0000 UTC m=+359.915925787" observedRunningTime="2026-03-12 13:15:37.769032931 +0000 UTC m=+360.459104982" watchObservedRunningTime="2026-03-12 13:15:37.775929741 +0000 UTC m=+360.466001722" Mar 12 13:15:38 crc kubenswrapper[4921]: I0312 13:15:38.742248 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerStarted","Data":"1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e"} Mar 12 13:15:38 crc kubenswrapper[4921]: I0312 13:15:38.775134 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-26m8d" podStartSLOduration=3.187559332 podStartE2EDuration="6.775116262s" podCreationTimestamp="2026-03-12 13:15:32 +0000 UTC" firstStartedPulling="2026-03-12 13:15:34.708918694 +0000 UTC m=+357.398990665" lastFinishedPulling="2026-03-12 13:15:38.296475624 +0000 UTC m=+360.986547595" observedRunningTime="2026-03-12 13:15:38.771153985 +0000 UTC m=+361.461225966" watchObservedRunningTime="2026-03-12 13:15:38.775116262 +0000 UTC m=+361.465188233" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.640468 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.640534 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.712237 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.800422 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jnh65" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.828946 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.828983 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:40 crc kubenswrapper[4921]: I0312 13:15:40.925541 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:41 crc kubenswrapper[4921]: I0312 13:15:41.832043 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6gjth" Mar 12 13:15:43 crc kubenswrapper[4921]: I0312 13:15:43.046502 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:43 crc kubenswrapper[4921]: I0312 13:15:43.046824 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:43 crc kubenswrapper[4921]: I0312 13:15:43.262804 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:43 crc kubenswrapper[4921]: I0312 13:15:43.262914 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:43 crc kubenswrapper[4921]: I0312 13:15:43.308705 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:43 crc kubenswrapper[4921]: I0312 
13:15:43.820642 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:15:44 crc kubenswrapper[4921]: I0312 13:15:44.084692 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-26m8d" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="registry-server" probeResult="failure" output=< Mar 12 13:15:44 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 13:15:44 crc kubenswrapper[4921]: > Mar 12 13:15:53 crc kubenswrapper[4921]: I0312 13:15:53.109359 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:53 crc kubenswrapper[4921]: I0312 13:15:53.173435 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.199049 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" podUID="29a3ac39-3f54-47f8-947e-c5d5f4709c23" containerName="registry" containerID="cri-o://1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed" gracePeriod=30 Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.609393 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808416 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-trusted-ca\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808507 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-certificates\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808537 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt5fc\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-kube-api-access-nt5fc\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808700 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808730 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-bound-sa-token\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808755 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-tls\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808773 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29a3ac39-3f54-47f8-947e-c5d5f4709c23-installation-pull-secrets\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.808800 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29a3ac39-3f54-47f8-947e-c5d5f4709c23-ca-trust-extracted\") pod \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\" (UID: \"29a3ac39-3f54-47f8-947e-c5d5f4709c23\") " Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.809394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.813471 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.815067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.819141 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-kube-api-access-nt5fc" (OuterVolumeSpecName: "kube-api-access-nt5fc") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "kube-api-access-nt5fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.819284 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a3ac39-3f54-47f8-947e-c5d5f4709c23-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.821231 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.825556 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.829164 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a3ac39-3f54-47f8-947e-c5d5f4709c23-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "29a3ac39-3f54-47f8-947e-c5d5f4709c23" (UID: "29a3ac39-3f54-47f8-947e-c5d5f4709c23"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.846745 4921 generic.go:334] "Generic (PLEG): container finished" podID="29a3ac39-3f54-47f8-947e-c5d5f4709c23" containerID="1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed" exitCode=0 Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.846785 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" event={"ID":"29a3ac39-3f54-47f8-947e-c5d5f4709c23","Type":"ContainerDied","Data":"1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed"} Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.846804 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.846836 4921 scope.go:117] "RemoveContainer" containerID="1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.846826 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-f2bw7" event={"ID":"29a3ac39-3f54-47f8-947e-c5d5f4709c23","Type":"ContainerDied","Data":"ef4379a2a58f57caa146d2690162fcab9e8047b97f9b297a5f0933354c774d1b"} Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.871230 4921 scope.go:117] "RemoveContainer" containerID="1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed" Mar 12 13:15:57 crc kubenswrapper[4921]: E0312 13:15:57.874106 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed\": container with ID starting with 1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed not found: ID does not exist" containerID="1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.874160 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed"} err="failed to get container status \"1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed\": rpc error: code = NotFound desc = could not find container \"1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed\": container with ID starting with 1285ddcfe93f2f1759ca10ab17d22ad416b14b5247186241e5f08b7fd0b691ed not found: ID does not exist" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.884850 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-f2bw7"] Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.895144 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-f2bw7"] Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909862 4921 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/29a3ac39-3f54-47f8-947e-c5d5f4709c23-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909896 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909906 4921 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909923 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt5fc\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-kube-api-access-nt5fc\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909933 4921 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909953 4921 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/29a3ac39-3f54-47f8-947e-c5d5f4709c23-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.909962 4921 reconciler_common.go:293] "Volume detached for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/29a3ac39-3f54-47f8-947e-c5d5f4709c23-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 12 13:15:57 crc kubenswrapper[4921]: I0312 13:15:57.997980 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a3ac39-3f54-47f8-947e-c5d5f4709c23" path="/var/lib/kubelet/pods/29a3ac39-3f54-47f8-947e-c5d5f4709c23/volumes" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.144247 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555356-jkdbf"] Mar 12 13:16:00 crc kubenswrapper[4921]: E0312 13:16:00.146112 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a3ac39-3f54-47f8-947e-c5d5f4709c23" containerName="registry" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.146275 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a3ac39-3f54-47f8-947e-c5d5f4709c23" containerName="registry" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.146608 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a3ac39-3f54-47f8-947e-c5d5f4709c23" containerName="registry" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.147334 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.151650 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555356-jkdbf"] Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.152508 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.153108 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.153204 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.345614 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj8qp\" (UniqueName: \"kubernetes.io/projected/0a9d799a-ae1e-4012-8c57-c78ceeadfb49-kube-api-access-kj8qp\") pod \"auto-csr-approver-29555356-jkdbf\" (UID: \"0a9d799a-ae1e-4012-8c57-c78ceeadfb49\") " pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.447520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj8qp\" (UniqueName: \"kubernetes.io/projected/0a9d799a-ae1e-4012-8c57-c78ceeadfb49-kube-api-access-kj8qp\") pod \"auto-csr-approver-29555356-jkdbf\" (UID: \"0a9d799a-ae1e-4012-8c57-c78ceeadfb49\") " pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:00 crc kubenswrapper[4921]: I0312 13:16:00.477000 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj8qp\" (UniqueName: \"kubernetes.io/projected/0a9d799a-ae1e-4012-8c57-c78ceeadfb49-kube-api-access-kj8qp\") pod \"auto-csr-approver-29555356-jkdbf\" (UID: \"0a9d799a-ae1e-4012-8c57-c78ceeadfb49\") " 
pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:01 crc kubenswrapper[4921]: I0312 13:16:01.117054 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:01 crc kubenswrapper[4921]: I0312 13:16:01.491870 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555356-jkdbf"] Mar 12 13:16:01 crc kubenswrapper[4921]: I0312 13:16:01.871587 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" event={"ID":"0a9d799a-ae1e-4012-8c57-c78ceeadfb49","Type":"ContainerStarted","Data":"935a8ebc5962d2706c16751174d7b77cd84fd54a4644db1f6be123fe2434a821"} Mar 12 13:16:03 crc kubenswrapper[4921]: I0312 13:16:03.885097 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a9d799a-ae1e-4012-8c57-c78ceeadfb49" containerID="82c2fad463f497e13167cbacc577b607de465341fbac53636744fe234966b3b1" exitCode=0 Mar 12 13:16:03 crc kubenswrapper[4921]: I0312 13:16:03.885169 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" event={"ID":"0a9d799a-ae1e-4012-8c57-c78ceeadfb49","Type":"ContainerDied","Data":"82c2fad463f497e13167cbacc577b607de465341fbac53636744fe234966b3b1"} Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.127243 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.307887 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj8qp\" (UniqueName: \"kubernetes.io/projected/0a9d799a-ae1e-4012-8c57-c78ceeadfb49-kube-api-access-kj8qp\") pod \"0a9d799a-ae1e-4012-8c57-c78ceeadfb49\" (UID: \"0a9d799a-ae1e-4012-8c57-c78ceeadfb49\") " Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.318598 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9d799a-ae1e-4012-8c57-c78ceeadfb49-kube-api-access-kj8qp" (OuterVolumeSpecName: "kube-api-access-kj8qp") pod "0a9d799a-ae1e-4012-8c57-c78ceeadfb49" (UID: "0a9d799a-ae1e-4012-8c57-c78ceeadfb49"). InnerVolumeSpecName "kube-api-access-kj8qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.409431 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj8qp\" (UniqueName: \"kubernetes.io/projected/0a9d799a-ae1e-4012-8c57-c78ceeadfb49-kube-api-access-kj8qp\") on node \"crc\" DevicePath \"\"" Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.900501 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" event={"ID":"0a9d799a-ae1e-4012-8c57-c78ceeadfb49","Type":"ContainerDied","Data":"935a8ebc5962d2706c16751174d7b77cd84fd54a4644db1f6be123fe2434a821"} Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.900559 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="935a8ebc5962d2706c16751174d7b77cd84fd54a4644db1f6be123fe2434a821" Mar 12 13:16:05 crc kubenswrapper[4921]: I0312 13:16:05.900570 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555356-jkdbf" Mar 12 13:16:26 crc kubenswrapper[4921]: I0312 13:16:26.324728 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:16:26 crc kubenswrapper[4921]: I0312 13:16:26.325440 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:16:56 crc kubenswrapper[4921]: I0312 13:16:56.323732 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:16:56 crc kubenswrapper[4921]: I0312 13:16:56.324261 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:17:26 crc kubenswrapper[4921]: I0312 13:17:26.324179 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:17:26 crc kubenswrapper[4921]: I0312 13:17:26.324928 4921 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:17:26 crc kubenswrapper[4921]: I0312 13:17:26.324985 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:17:26 crc kubenswrapper[4921]: I0312 13:17:26.325634 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"10c61861e52d240193680813d2394b39e92b34ce948352b7c71e1120e87603ad"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:17:26 crc kubenswrapper[4921]: I0312 13:17:26.325705 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://10c61861e52d240193680813d2394b39e92b34ce948352b7c71e1120e87603ad" gracePeriod=600 Mar 12 13:17:27 crc kubenswrapper[4921]: I0312 13:17:27.441280 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="10c61861e52d240193680813d2394b39e92b34ce948352b7c71e1120e87603ad" exitCode=0 Mar 12 13:17:27 crc kubenswrapper[4921]: I0312 13:17:27.441356 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"10c61861e52d240193680813d2394b39e92b34ce948352b7c71e1120e87603ad"} Mar 12 13:17:27 crc kubenswrapper[4921]: I0312 
13:17:27.441869 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580"} Mar 12 13:17:27 crc kubenswrapper[4921]: I0312 13:17:27.441891 4921 scope.go:117] "RemoveContainer" containerID="3558d676a3c882348661fd9967700d03038460628a1f557e21868fc5a9c603bc" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.152414 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555358-bbn64"] Mar 12 13:18:00 crc kubenswrapper[4921]: E0312 13:18:00.155473 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9d799a-ae1e-4012-8c57-c78ceeadfb49" containerName="oc" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.155661 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9d799a-ae1e-4012-8c57-c78ceeadfb49" containerName="oc" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.156173 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9d799a-ae1e-4012-8c57-c78ceeadfb49" containerName="oc" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.157285 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.161212 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.161373 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.161224 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.163135 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555358-bbn64"] Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.266161 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n772j\" (UniqueName: \"kubernetes.io/projected/485ff5b7-a500-4dd6-b619-876713a66893-kube-api-access-n772j\") pod \"auto-csr-approver-29555358-bbn64\" (UID: \"485ff5b7-a500-4dd6-b619-876713a66893\") " pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.367770 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n772j\" (UniqueName: \"kubernetes.io/projected/485ff5b7-a500-4dd6-b619-876713a66893-kube-api-access-n772j\") pod \"auto-csr-approver-29555358-bbn64\" (UID: \"485ff5b7-a500-4dd6-b619-876713a66893\") " pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.402351 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n772j\" (UniqueName: \"kubernetes.io/projected/485ff5b7-a500-4dd6-b619-876713a66893-kube-api-access-n772j\") pod \"auto-csr-approver-29555358-bbn64\" (UID: \"485ff5b7-a500-4dd6-b619-876713a66893\") " 
pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.486073 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.947026 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555358-bbn64"] Mar 12 13:18:00 crc kubenswrapper[4921]: I0312 13:18:00.955424 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:18:01 crc kubenswrapper[4921]: I0312 13:18:01.709354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555358-bbn64" event={"ID":"485ff5b7-a500-4dd6-b619-876713a66893","Type":"ContainerStarted","Data":"37ab72dd0a4ffa74c145b1e73bdb54a412cbb209b08875d55e65b5cb03d532a9"} Mar 12 13:18:02 crc kubenswrapper[4921]: I0312 13:18:02.723913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555358-bbn64" event={"ID":"485ff5b7-a500-4dd6-b619-876713a66893","Type":"ContainerStarted","Data":"de2ba02afd825c8f08eddc6e570e30f2421d22424ee4e2ba6b29e1c1910aa587"} Mar 12 13:18:02 crc kubenswrapper[4921]: I0312 13:18:02.744876 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555358-bbn64" podStartSLOduration=1.397207475 podStartE2EDuration="2.744858815s" podCreationTimestamp="2026-03-12 13:18:00 +0000 UTC" firstStartedPulling="2026-03-12 13:18:00.954911555 +0000 UTC m=+503.644983566" lastFinishedPulling="2026-03-12 13:18:02.302562885 +0000 UTC m=+504.992634906" observedRunningTime="2026-03-12 13:18:02.74093936 +0000 UTC m=+505.431011381" watchObservedRunningTime="2026-03-12 13:18:02.744858815 +0000 UTC m=+505.434930786" Mar 12 13:18:03 crc kubenswrapper[4921]: I0312 13:18:03.734275 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="485ff5b7-a500-4dd6-b619-876713a66893" containerID="de2ba02afd825c8f08eddc6e570e30f2421d22424ee4e2ba6b29e1c1910aa587" exitCode=0 Mar 12 13:18:03 crc kubenswrapper[4921]: I0312 13:18:03.734334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555358-bbn64" event={"ID":"485ff5b7-a500-4dd6-b619-876713a66893","Type":"ContainerDied","Data":"de2ba02afd825c8f08eddc6e570e30f2421d22424ee4e2ba6b29e1c1910aa587"} Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.035685 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.141987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n772j\" (UniqueName: \"kubernetes.io/projected/485ff5b7-a500-4dd6-b619-876713a66893-kube-api-access-n772j\") pod \"485ff5b7-a500-4dd6-b619-876713a66893\" (UID: \"485ff5b7-a500-4dd6-b619-876713a66893\") " Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.150581 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485ff5b7-a500-4dd6-b619-876713a66893-kube-api-access-n772j" (OuterVolumeSpecName: "kube-api-access-n772j") pod "485ff5b7-a500-4dd6-b619-876713a66893" (UID: "485ff5b7-a500-4dd6-b619-876713a66893"). InnerVolumeSpecName "kube-api-access-n772j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.243270 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n772j\" (UniqueName: \"kubernetes.io/projected/485ff5b7-a500-4dd6-b619-876713a66893-kube-api-access-n772j\") on node \"crc\" DevicePath \"\"" Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.756003 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555358-bbn64" event={"ID":"485ff5b7-a500-4dd6-b619-876713a66893","Type":"ContainerDied","Data":"37ab72dd0a4ffa74c145b1e73bdb54a412cbb209b08875d55e65b5cb03d532a9"} Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.756069 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37ab72dd0a4ffa74c145b1e73bdb54a412cbb209b08875d55e65b5cb03d532a9" Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.756109 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555358-bbn64" Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.810393 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555352-kdcsz"] Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.814429 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555352-kdcsz"] Mar 12 13:18:05 crc kubenswrapper[4921]: I0312 13:18:05.993051 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc6eb617-cfea-4abf-81fd-8417dc305d9c" path="/var/lib/kubelet/pods/fc6eb617-cfea-4abf-81fd-8417dc305d9c/volumes" Mar 12 13:18:38 crc kubenswrapper[4921]: I0312 13:18:38.327158 4921 scope.go:117] "RemoveContainer" containerID="e431e0e27e2398cdb9c5a15802593ff33ca2e82474e2c9fedf1b2d11a2daf186" Mar 12 13:18:38 crc kubenswrapper[4921]: I0312 13:18:38.382405 4921 scope.go:117] "RemoveContainer" 
containerID="5cc786e0764817cbde2bb24b1ebed0ddc75fc26e757dba2a7eccc80eab22b728"
Mar 12 13:19:38 crc kubenswrapper[4921]: I0312 13:19:38.418648 4921 scope.go:117] "RemoveContainer" containerID="db68c1cea3e3dc00ddeae4350261332d29e67adfadb9250501e6ae376506065a"
Mar 12 13:19:38 crc kubenswrapper[4921]: I0312 13:19:38.447857 4921 scope.go:117] "RemoveContainer" containerID="d1fb2f0dcd47401856c892e8ec8f44677ec89df061c36fdce8ab427add3b0414"
Mar 12 13:19:56 crc kubenswrapper[4921]: I0312 13:19:56.323480 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 13:19:56 crc kubenswrapper[4921]: I0312 13:19:56.324127 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.167537 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555360-t8xh2"]
Mar 12 13:20:00 crc kubenswrapper[4921]: E0312 13:20:00.168298 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485ff5b7-a500-4dd6-b619-876713a66893" containerName="oc"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.168320 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="485ff5b7-a500-4dd6-b619-876713a66893" containerName="oc"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.168513 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="485ff5b7-a500-4dd6-b619-876713a66893" containerName="oc"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.169208 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.173752 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.174118 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.175499 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.188861 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555360-t8xh2"]
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.365043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk9g5\" (UniqueName: \"kubernetes.io/projected/68c6a2d3-4d37-4e4a-b16b-011befefbb0c-kube-api-access-sk9g5\") pod \"auto-csr-approver-29555360-t8xh2\" (UID: \"68c6a2d3-4d37-4e4a-b16b-011befefbb0c\") " pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.465879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk9g5\" (UniqueName: \"kubernetes.io/projected/68c6a2d3-4d37-4e4a-b16b-011befefbb0c-kube-api-access-sk9g5\") pod \"auto-csr-approver-29555360-t8xh2\" (UID: \"68c6a2d3-4d37-4e4a-b16b-011befefbb0c\") " pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.491792 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk9g5\" (UniqueName: \"kubernetes.io/projected/68c6a2d3-4d37-4e4a-b16b-011befefbb0c-kube-api-access-sk9g5\") pod \"auto-csr-approver-29555360-t8xh2\" (UID: \"68c6a2d3-4d37-4e4a-b16b-011befefbb0c\") " pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:00 crc kubenswrapper[4921]: I0312 13:20:00.788845 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:01 crc kubenswrapper[4921]: I0312 13:20:01.072576 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555360-t8xh2"]
Mar 12 13:20:01 crc kubenswrapper[4921]: I0312 13:20:01.551660 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555360-t8xh2" event={"ID":"68c6a2d3-4d37-4e4a-b16b-011befefbb0c","Type":"ContainerStarted","Data":"8d79a58d06758d98c9ba5dc26200bbba7f9891c5c85d42c4682c47041f137c60"}
Mar 12 13:20:02 crc kubenswrapper[4921]: I0312 13:20:02.563079 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555360-t8xh2" event={"ID":"68c6a2d3-4d37-4e4a-b16b-011befefbb0c","Type":"ContainerStarted","Data":"1e1266d4e3cb9f6d4ada0273b076dc709587dba584b297b496c13f57a8b2cc16"}
Mar 12 13:20:02 crc kubenswrapper[4921]: I0312 13:20:02.583539 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555360-t8xh2" podStartSLOduration=1.495987783 podStartE2EDuration="2.583514816s" podCreationTimestamp="2026-03-12 13:20:00 +0000 UTC" firstStartedPulling="2026-03-12 13:20:01.094100224 +0000 UTC m=+623.784172205" lastFinishedPulling="2026-03-12 13:20:02.181627227 +0000 UTC m=+624.871699238" observedRunningTime="2026-03-12 13:20:02.579371653 +0000 UTC m=+625.269443624" watchObservedRunningTime="2026-03-12 13:20:02.583514816 +0000 UTC m=+625.273586827"
Mar 12 13:20:03 crc kubenswrapper[4921]: I0312 13:20:03.570150 4921 generic.go:334] "Generic (PLEG): container finished" podID="68c6a2d3-4d37-4e4a-b16b-011befefbb0c" containerID="1e1266d4e3cb9f6d4ada0273b076dc709587dba584b297b496c13f57a8b2cc16" exitCode=0
Mar 12 13:20:03 crc kubenswrapper[4921]: I0312 13:20:03.570196 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555360-t8xh2" event={"ID":"68c6a2d3-4d37-4e4a-b16b-011befefbb0c","Type":"ContainerDied","Data":"1e1266d4e3cb9f6d4ada0273b076dc709587dba584b297b496c13f57a8b2cc16"}
Mar 12 13:20:04 crc kubenswrapper[4921]: I0312 13:20:04.851269 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.034230 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk9g5\" (UniqueName: \"kubernetes.io/projected/68c6a2d3-4d37-4e4a-b16b-011befefbb0c-kube-api-access-sk9g5\") pod \"68c6a2d3-4d37-4e4a-b16b-011befefbb0c\" (UID: \"68c6a2d3-4d37-4e4a-b16b-011befefbb0c\") "
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.042910 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c6a2d3-4d37-4e4a-b16b-011befefbb0c-kube-api-access-sk9g5" (OuterVolumeSpecName: "kube-api-access-sk9g5") pod "68c6a2d3-4d37-4e4a-b16b-011befefbb0c" (UID: "68c6a2d3-4d37-4e4a-b16b-011befefbb0c"). InnerVolumeSpecName "kube-api-access-sk9g5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.136435 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk9g5\" (UniqueName: \"kubernetes.io/projected/68c6a2d3-4d37-4e4a-b16b-011befefbb0c-kube-api-access-sk9g5\") on node \"crc\" DevicePath \"\""
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.585200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555360-t8xh2" event={"ID":"68c6a2d3-4d37-4e4a-b16b-011befefbb0c","Type":"ContainerDied","Data":"8d79a58d06758d98c9ba5dc26200bbba7f9891c5c85d42c4682c47041f137c60"}
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.585257 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555360-t8xh2"
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.585276 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d79a58d06758d98c9ba5dc26200bbba7f9891c5c85d42c4682c47041f137c60"
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.652907 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555354-kkl75"]
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.657599 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555354-kkl75"]
Mar 12 13:20:05 crc kubenswrapper[4921]: I0312 13:20:05.994131 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5704616-685f-49f3-9dd7-bb080b87cf29" path="/var/lib/kubelet/pods/d5704616-685f-49f3-9dd7-bb080b87cf29/volumes"
Mar 12 13:20:26 crc kubenswrapper[4921]: I0312 13:20:26.324716 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 13:20:26 crc kubenswrapper[4921]: I0312 13:20:26.325445 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 13:20:38 crc kubenswrapper[4921]: I0312 13:20:38.492433 4921 scope.go:117] "RemoveContainer" containerID="192152a3bc8743f7d3d4259bd68947af5ad7ef207a58fbb29a437f3f703149eb"
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.324436 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.324993 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.325052 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq"
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.325725 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.325804 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580" gracePeriod=600
Mar 12 13:20:56 crc kubenswrapper[4921]: E0312 13:20:56.462681 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae82cb49_657a_4b47_8107_0729b9edf47b.slice/crio-conmon-51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580.scope\": RecentStats: unable to find data in memory cache]"
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.934271 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580" exitCode=0
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.934351 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580"}
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.934733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"107f4a8503d4c0486ad2c3402e1b2b2b1ceede9b611f44e27a27f3de56a8e4cf"}
Mar 12 13:20:56 crc kubenswrapper[4921]: I0312 13:20:56.934757 4921 scope.go:117] "RemoveContainer" containerID="10c61861e52d240193680813d2394b39e92b34ce948352b7c71e1120e87603ad"
Mar 12 13:21:52 crc kubenswrapper[4921]: I0312 13:21:52.136353 4921 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.179237 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555362-df84w"]
Mar 12 13:22:00 crc kubenswrapper[4921]: E0312 13:22:00.179877 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68c6a2d3-4d37-4e4a-b16b-011befefbb0c" containerName="oc"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.179890 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c6a2d3-4d37-4e4a-b16b-011befefbb0c" containerName="oc"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.179994 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="68c6a2d3-4d37-4e4a-b16b-011befefbb0c" containerName="oc"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.180307 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.181907 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.182266 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.182281 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.196514 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555362-df84w"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.365063 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99wnx\" (UniqueName: \"kubernetes.io/projected/90e979e7-b235-4702-b7b6-303d881df7bb-kube-api-access-99wnx\") pod \"auto-csr-approver-29555362-df84w\" (UID: \"90e979e7-b235-4702-b7b6-303d881df7bb\") " pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.466323 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99wnx\" (UniqueName: \"kubernetes.io/projected/90e979e7-b235-4702-b7b6-303d881df7bb-kube-api-access-99wnx\") pod \"auto-csr-approver-29555362-df84w\" (UID: \"90e979e7-b235-4702-b7b6-303d881df7bb\") " pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.475828 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.476550 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.480746 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-244xc"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.482428 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.482434 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.488523 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jw9bb"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.489283 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.492190 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6f2wx"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.499848 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99wnx\" (UniqueName: \"kubernetes.io/projected/90e979e7-b235-4702-b7b6-303d881df7bb-kube-api-access-99wnx\") pod \"auto-csr-approver-29555362-df84w\" (UID: \"90e979e7-b235-4702-b7b6-303d881df7bb\") " pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.501877 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-d22fr"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.502575 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-d22fr"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.504143 4921 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xr5wh"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.511657 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.530619 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-d22fr"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.548616 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jw9bb"]
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.669417 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gstrs\" (UniqueName: \"kubernetes.io/projected/aabe30ef-92c9-4d25-8278-09d1dba1583b-kube-api-access-gstrs\") pod \"cert-manager-858654f9db-d22fr\" (UID: \"aabe30ef-92c9-4d25-8278-09d1dba1583b\") " pod="cert-manager/cert-manager-858654f9db-d22fr"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.669510 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbgnt\" (UniqueName: \"kubernetes.io/projected/b02a546a-2d4e-4de2-9673-9c7b2d37a6e8-kube-api-access-bbgnt\") pod \"cert-manager-cainjector-cf98fcc89-bbpqn\" (UID: \"b02a546a-2d4e-4de2-9673-9c7b2d37a6e8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.669601 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4vbx\" (UniqueName: \"kubernetes.io/projected/5e022cd5-783e-4dbe-a554-42a43e2bc746-kube-api-access-d4vbx\") pod \"cert-manager-webhook-687f57d79b-jw9bb\" (UID: \"5e022cd5-783e-4dbe-a554-42a43e2bc746\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.770448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gstrs\" (UniqueName: \"kubernetes.io/projected/aabe30ef-92c9-4d25-8278-09d1dba1583b-kube-api-access-gstrs\") pod \"cert-manager-858654f9db-d22fr\" (UID: \"aabe30ef-92c9-4d25-8278-09d1dba1583b\") " pod="cert-manager/cert-manager-858654f9db-d22fr"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.771268 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbgnt\" (UniqueName: \"kubernetes.io/projected/b02a546a-2d4e-4de2-9673-9c7b2d37a6e8-kube-api-access-bbgnt\") pod \"cert-manager-cainjector-cf98fcc89-bbpqn\" (UID: \"b02a546a-2d4e-4de2-9673-9c7b2d37a6e8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.771526 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4vbx\" (UniqueName: \"kubernetes.io/projected/5e022cd5-783e-4dbe-a554-42a43e2bc746-kube-api-access-d4vbx\") pod \"cert-manager-webhook-687f57d79b-jw9bb\" (UID: \"5e022cd5-783e-4dbe-a554-42a43e2bc746\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.787953 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gstrs\" (UniqueName: \"kubernetes.io/projected/aabe30ef-92c9-4d25-8278-09d1dba1583b-kube-api-access-gstrs\") pod \"cert-manager-858654f9db-d22fr\" (UID: \"aabe30ef-92c9-4d25-8278-09d1dba1583b\") " pod="cert-manager/cert-manager-858654f9db-d22fr"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.791285 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4vbx\" (UniqueName: \"kubernetes.io/projected/5e022cd5-783e-4dbe-a554-42a43e2bc746-kube-api-access-d4vbx\") pod \"cert-manager-webhook-687f57d79b-jw9bb\" (UID: \"5e022cd5-783e-4dbe-a554-42a43e2bc746\") " pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.794110 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.799366 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbgnt\" (UniqueName: \"kubernetes.io/projected/b02a546a-2d4e-4de2-9673-9c7b2d37a6e8-kube-api-access-bbgnt\") pod \"cert-manager-cainjector-cf98fcc89-bbpqn\" (UID: \"b02a546a-2d4e-4de2-9673-9c7b2d37a6e8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.833723 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.841857 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:00 crc kubenswrapper[4921]: I0312 13:22:00.847695 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-d22fr"
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.000690 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555362-df84w"]
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.263677 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn"]
Mar 12 13:22:01 crc kubenswrapper[4921]: W0312 13:22:01.268649 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb02a546a_2d4e_4de2_9673_9c7b2d37a6e8.slice/crio-fb345cc80f5a702f4d6f05ef00fa42a64e931899aedd69cb6bf0dcaf34f63358 WatchSource:0}: Error finding container fb345cc80f5a702f4d6f05ef00fa42a64e931899aedd69cb6bf0dcaf34f63358: Status 404 returned error can't find the container with id fb345cc80f5a702f4d6f05ef00fa42a64e931899aedd69cb6bf0dcaf34f63358
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.312760 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-d22fr"]
Mar 12 13:22:01 crc kubenswrapper[4921]: W0312 13:22:01.315702 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaabe30ef_92c9_4d25_8278_09d1dba1583b.slice/crio-4c4249b8c3d07352bf5d7cfa04adced14f00ac5448b7a2453822aea035258813 WatchSource:0}: Error finding container 4c4249b8c3d07352bf5d7cfa04adced14f00ac5448b7a2453822aea035258813: Status 404 returned error can't find the container with id 4c4249b8c3d07352bf5d7cfa04adced14f00ac5448b7a2453822aea035258813
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.325131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-jw9bb"]
Mar 12 13:22:01 crc kubenswrapper[4921]: W0312 13:22:01.329483 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e022cd5_783e_4dbe_a554_42a43e2bc746.slice/crio-ff73a8eb0e5575aefcfe0cc61ce0690d586838b95d887b232e92efdff5f3ff59 WatchSource:0}: Error finding container ff73a8eb0e5575aefcfe0cc61ce0690d586838b95d887b232e92efdff5f3ff59: Status 404 returned error can't find the container with id ff73a8eb0e5575aefcfe0cc61ce0690d586838b95d887b232e92efdff5f3ff59
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.351268 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn" event={"ID":"b02a546a-2d4e-4de2-9673-9c7b2d37a6e8","Type":"ContainerStarted","Data":"fb345cc80f5a702f4d6f05ef00fa42a64e931899aedd69cb6bf0dcaf34f63358"}
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.352345 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-d22fr" event={"ID":"aabe30ef-92c9-4d25-8278-09d1dba1583b","Type":"ContainerStarted","Data":"4c4249b8c3d07352bf5d7cfa04adced14f00ac5448b7a2453822aea035258813"}
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.353317 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb" event={"ID":"5e022cd5-783e-4dbe-a554-42a43e2bc746","Type":"ContainerStarted","Data":"ff73a8eb0e5575aefcfe0cc61ce0690d586838b95d887b232e92efdff5f3ff59"}
Mar 12 13:22:01 crc kubenswrapper[4921]: I0312 13:22:01.354091 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555362-df84w" event={"ID":"90e979e7-b235-4702-b7b6-303d881df7bb","Type":"ContainerStarted","Data":"1b308baa923200264dc266f9cb5ec66fd20a4faa855515a7814a0141d319824c"}
Mar 12 13:22:04 crc kubenswrapper[4921]: I0312 13:22:04.373533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb" event={"ID":"5e022cd5-783e-4dbe-a554-42a43e2bc746","Type":"ContainerStarted","Data":"d9b5e7318225c4beb7e57fbdf75835f0c0046975a2ccff09bebce65169bc2392"}
Mar 12 13:22:04 crc kubenswrapper[4921]: I0312 13:22:04.374172 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:04 crc kubenswrapper[4921]: I0312 13:22:04.392379 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb" podStartSLOduration=2.338241877 podStartE2EDuration="4.39236046s" podCreationTimestamp="2026-03-12 13:22:00 +0000 UTC" firstStartedPulling="2026-03-12 13:22:01.331214065 +0000 UTC m=+744.021286026" lastFinishedPulling="2026-03-12 13:22:03.385332638 +0000 UTC m=+746.075404609" observedRunningTime="2026-03-12 13:22:04.391772071 +0000 UTC m=+747.081844092" watchObservedRunningTime="2026-03-12 13:22:04.39236046 +0000 UTC m=+747.082432431"
Mar 12 13:22:05 crc kubenswrapper[4921]: I0312 13:22:05.379330 4921 generic.go:334] "Generic (PLEG): container finished" podID="90e979e7-b235-4702-b7b6-303d881df7bb" containerID="02b9faa37d3c073ef9dfec8e1be069df54bf22e3bb65f54526c0702581df5f4b" exitCode=0
Mar 12 13:22:05 crc kubenswrapper[4921]: I0312 13:22:05.379395 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555362-df84w" event={"ID":"90e979e7-b235-4702-b7b6-303d881df7bb","Type":"ContainerDied","Data":"02b9faa37d3c073ef9dfec8e1be069df54bf22e3bb65f54526c0702581df5f4b"}
Mar 12 13:22:05 crc kubenswrapper[4921]: I0312 13:22:05.380712 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn" event={"ID":"b02a546a-2d4e-4de2-9673-9c7b2d37a6e8","Type":"ContainerStarted","Data":"f62537ffe4b13ad6f3e4eb92adc15abc6c2b4bf58ea9ce370d03a33ca9f95800"}
Mar 12 13:22:05 crc kubenswrapper[4921]: I0312 13:22:05.381983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-d22fr" event={"ID":"aabe30ef-92c9-4d25-8278-09d1dba1583b","Type":"ContainerStarted","Data":"e7b2015885f6763f42e3fcf23b194f8c38244c1f229face604a4bd3b585fe462"}
Mar 12 13:22:05 crc kubenswrapper[4921]: I0312 13:22:05.411988 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-d22fr" podStartSLOduration=2.003393317 podStartE2EDuration="5.411968024s" podCreationTimestamp="2026-03-12 13:22:00 +0000 UTC" firstStartedPulling="2026-03-12 13:22:01.319403638 +0000 UTC m=+744.009475609" lastFinishedPulling="2026-03-12 13:22:04.727978325 +0000 UTC m=+747.418050316" observedRunningTime="2026-03-12 13:22:05.408301797 +0000 UTC m=+748.098373768" watchObservedRunningTime="2026-03-12 13:22:05.411968024 +0000 UTC m=+748.102039995"
Mar 12 13:22:05 crc kubenswrapper[4921]: I0312 13:22:05.466666 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-bbpqn" podStartSLOduration=2.01258438 podStartE2EDuration="5.46664304s" podCreationTimestamp="2026-03-12 13:22:00 +0000 UTC" firstStartedPulling="2026-03-12 13:22:01.27093647 +0000 UTC m=+743.961008441" lastFinishedPulling="2026-03-12 13:22:04.72499512 +0000 UTC m=+747.415067101" observedRunningTime="2026-03-12 13:22:05.462910031 +0000 UTC m=+748.152982002" watchObservedRunningTime="2026-03-12 13:22:05.46664304 +0000 UTC m=+748.156715011"
Mar 12 13:22:06 crc kubenswrapper[4921]: I0312 13:22:06.701392 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:06 crc kubenswrapper[4921]: I0312 13:22:06.877757 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99wnx\" (UniqueName: \"kubernetes.io/projected/90e979e7-b235-4702-b7b6-303d881df7bb-kube-api-access-99wnx\") pod \"90e979e7-b235-4702-b7b6-303d881df7bb\" (UID: \"90e979e7-b235-4702-b7b6-303d881df7bb\") "
Mar 12 13:22:06 crc kubenswrapper[4921]: I0312 13:22:06.883439 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e979e7-b235-4702-b7b6-303d881df7bb-kube-api-access-99wnx" (OuterVolumeSpecName: "kube-api-access-99wnx") pod "90e979e7-b235-4702-b7b6-303d881df7bb" (UID: "90e979e7-b235-4702-b7b6-303d881df7bb"). InnerVolumeSpecName "kube-api-access-99wnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:22:06 crc kubenswrapper[4921]: I0312 13:22:06.979304 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99wnx\" (UniqueName: \"kubernetes.io/projected/90e979e7-b235-4702-b7b6-303d881df7bb-kube-api-access-99wnx\") on node \"crc\" DevicePath \"\""
Mar 12 13:22:07 crc kubenswrapper[4921]: I0312 13:22:07.395194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555362-df84w" event={"ID":"90e979e7-b235-4702-b7b6-303d881df7bb","Type":"ContainerDied","Data":"1b308baa923200264dc266f9cb5ec66fd20a4faa855515a7814a0141d319824c"}
Mar 12 13:22:07 crc kubenswrapper[4921]: I0312 13:22:07.395550 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b308baa923200264dc266f9cb5ec66fd20a4faa855515a7814a0141d319824c"
Mar 12 13:22:07 crc kubenswrapper[4921]: I0312 13:22:07.395271 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555362-df84w"
Mar 12 13:22:07 crc kubenswrapper[4921]: E0312 13:22:07.480099 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90e979e7_b235_4702_b7b6_303d881df7bb.slice/crio-1b308baa923200264dc266f9cb5ec66fd20a4faa855515a7814a0141d319824c\": RecentStats: unable to find data in memory cache]"
Mar 12 13:22:07 crc kubenswrapper[4921]: I0312 13:22:07.768257 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555356-jkdbf"]
Mar 12 13:22:07 crc kubenswrapper[4921]: I0312 13:22:07.774869 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555356-jkdbf"]
Mar 12 13:22:07 crc kubenswrapper[4921]: I0312 13:22:07.998677 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9d799a-ae1e-4012-8c57-c78ceeadfb49" path="/var/lib/kubelet/pods/0a9d799a-ae1e-4012-8c57-c78ceeadfb49/volumes"
Mar 12 13:22:10 crc kubenswrapper[4921]: I0312 13:22:10.846545 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-jw9bb"
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.898793 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl674"]
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900043 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-controller" containerID="cri-o://50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900548 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="sbdb" containerID="cri-o://704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900624 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="nbdb" containerID="cri-o://a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900681 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="northd" containerID="cri-o://a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900734 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900785 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-node" containerID="cri-o://622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.900872 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-acl-logging" containerID="cri-o://136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7" gracePeriod=30
Mar 12 13:22:30 crc kubenswrapper[4921]: I0312 13:22:30.969279 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovnkube-controller" containerID="cri-o://646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624" gracePeriod=30
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.249503 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl674_d5c679df-0a81-4663-a3fc-d7247c933507/ovn-acl-logging/0.log"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.250408 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl674_d5c679df-0a81-4663-a3fc-d7247c933507/ovn-controller/0.log"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.250849 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl674"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.320766 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t9lfk"]
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321010 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-node"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321026 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-node"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321039 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-acl-logging"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321047 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-acl-logging"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321059 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-ovn-metrics"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321067 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-ovn-metrics"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321081 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e979e7-b235-4702-b7b6-303d881df7bb" containerName="oc"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321088 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e979e7-b235-4702-b7b6-303d881df7bb" containerName="oc"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321098 4921 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="nbdb" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321105 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="nbdb" Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321119 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="northd" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321126 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="northd" Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321139 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kubecfg-setup" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321146 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kubecfg-setup" Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321157 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="sbdb" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321164 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="sbdb" Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321174 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-controller" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321182 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-controller" Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.321191 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovnkube-controller" 
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321198 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovnkube-controller" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321304 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321320 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="sbdb" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321329 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-acl-logging" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321341 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="nbdb" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321350 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovnkube-controller" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321359 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="northd" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321365 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="kube-rbac-proxy-node" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321372 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" containerName="ovn-controller" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.321380 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e979e7-b235-4702-b7b6-303d881df7bb" containerName="oc" 
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.323074 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442340 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-config\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442418 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-etc-openvswitch\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442452 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-ovn-kubernetes\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442506 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-netd\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442540 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-log-socket\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc 
kubenswrapper[4921]: I0312 13:22:31.442571 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-systemd-units\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442607 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-script-lib\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442646 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-kubelet\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442683 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-var-lib-openvswitch\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442717 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-ovn\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442750 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-openvswitch\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442788 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-env-overrides\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442852 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-systemd\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442939 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442982 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-slash\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443010 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-node-log\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-netns\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443092 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-bin\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc 
kubenswrapper[4921]: I0312 13:22:31.443201 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pxbl\" (UniqueName: \"kubernetes.io/projected/d5c679df-0a81-4663-a3fc-d7247c933507-kube-api-access-2pxbl\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442652 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.442667 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443160 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443156 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443183 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-log-socket" (OuterVolumeSpecName: "log-socket") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443201 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443266 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-slash" (OuterVolumeSpecName: "host-slash") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443269 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-node-log" (OuterVolumeSpecName: "node-log") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443291 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443298 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443533 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443339 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443423 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443504 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.443413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5c679df-0a81-4663-a3fc-d7247c933507-ovn-node-metrics-cert\") pod \"d5c679df-0a81-4663-a3fc-d7247c933507\" (UID: \"d5c679df-0a81-4663-a3fc-d7247c933507\") " Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444062 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-kubelet\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444119 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-cni-netd\") pod 
\"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444138 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-node-log\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444177 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-systemd\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444193 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-slash\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444213 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-log-socket\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovnkube-script-lib\") pod 
\"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444321 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovnkube-config\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444484 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-systemd-units\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444571 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-cni-bin\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444629 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovn-node-metrics-cert\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444665 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-env-overrides\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444698 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-var-lib-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-etc-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444877 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-ovn\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.444923 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445063 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-run-netns\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445154 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8j54\" (UniqueName: \"kubernetes.io/projected/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-kube-api-access-t8j54\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445248 4921 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445278 4921 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-slash\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445299 4921 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-node-log\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445316 4921 reconciler_common.go:293] "Volume 
detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445335 4921 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445354 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445370 4921 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445388 4921 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445407 4921 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445424 4921 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-log-socket\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445440 4921 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445456 4921 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445475 4921 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445492 4921 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445509 4921 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445526 4921 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.445543 4921 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d5c679df-0a81-4663-a3fc-d7247c933507-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.451756 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5c679df-0a81-4663-a3fc-d7247c933507-kube-api-access-2pxbl" (OuterVolumeSpecName: "kube-api-access-2pxbl") pod 
"d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "kube-api-access-2pxbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.452095 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5c679df-0a81-4663-a3fc-d7247c933507-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.463915 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d5c679df-0a81-4663-a3fc-d7247c933507" (UID: "d5c679df-0a81-4663-a3fc-d7247c933507"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.546991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovnkube-config\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.547611 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-systemd-units\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.547757 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-cni-bin\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.547903 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-cni-bin\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.547783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-systemd-units\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 
13:22:31.547940 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovn-node-metrics-cert\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.548254 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-env-overrides\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.548421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-var-lib-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.548321 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovnkube-config\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.548506 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-var-lib-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.548914 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.548922 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-run-ovn-kubernetes\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-etc-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549071 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-ovn\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549082 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-env-overrides\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549119 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549177 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-etc-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549225 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-run-netns\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549229 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-ovn\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549182 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-run-netns\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549332 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8j54\" (UniqueName: \"kubernetes.io/projected/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-kube-api-access-t8j54\") pod 
\"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549262 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-openvswitch\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549379 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549423 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-kubelet\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549462 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-cni-netd\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549493 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549499 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-node-log\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549534 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-node-log\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549549 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-systemd\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549572 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-slash\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549586 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-kubelet\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 
13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549590 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-log-socket\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549609 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-log-socket\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549634 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-run-systemd\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549637 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovnkube-script-lib\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549654 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-slash\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-host-cni-netd\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549772 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pxbl\" (UniqueName: \"kubernetes.io/projected/d5c679df-0a81-4663-a3fc-d7247c933507-kube-api-access-2pxbl\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549785 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d5c679df-0a81-4663-a3fc-d7247c933507-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.549796 4921 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d5c679df-0a81-4663-a3fc-d7247c933507-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.550748 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovnkube-script-lib\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.555053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-ovn-node-metrics-cert\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.582584 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t8j54\" (UniqueName: \"kubernetes.io/projected/9f647378-a687-4aa8-bc1d-0f3276a1e9d7-kube-api-access-t8j54\") pod \"ovnkube-node-t9lfk\" (UID: \"9f647378-a687-4aa8-bc1d-0f3276a1e9d7\") " pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.584291 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl674_d5c679df-0a81-4663-a3fc-d7247c933507/ovn-acl-logging/0.log" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585216 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rl674_d5c679df-0a81-4663-a3fc-d7247c933507/ovn-controller/0.log" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585713 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624" exitCode=0 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585746 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c" exitCode=0 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585759 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894" exitCode=0 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585769 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30" exitCode=0 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585780 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218" exitCode=0 Mar 
12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585789 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43" exitCode=0 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585797 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7" exitCode=143 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585806 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5c679df-0a81-4663-a3fc-d7247c933507" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c" exitCode=143 Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585859 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585895 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585938 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.585972 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586059 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586090 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586115 4921 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586132 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586177 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586196 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586211 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586226 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586242 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} Mar 12 
13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586257 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586271 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586286 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586300 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586322 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586346 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586364 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586380 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586396 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586411 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586426 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586442 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586457 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586472 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586495 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rl674" event={"ID":"d5c679df-0a81-4663-a3fc-d7247c933507","Type":"ContainerDied","Data":"fdec2af208e179df589d5582612ee9723bba2bf10a9c43d1830cb2721b55c499"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586519 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586537 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586553 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586571 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586586 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586601 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586619 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586634 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586649 4921 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.586680 4921 scope.go:117] "RemoveContainer" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.593271 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6tv6_db00f274-e86e-48c1-b0fe-5b4750265b85/kube-multus/0.log"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.593359 4921 generic.go:334] "Generic (PLEG): container finished" podID="db00f274-e86e-48c1-b0fe-5b4750265b85" containerID="64277a86d9ee84804412d148ece2e3feff1c23021c557d7224d7bab6172ce894" exitCode=2
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.593421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6tv6" event={"ID":"db00f274-e86e-48c1-b0fe-5b4750265b85","Type":"ContainerDied","Data":"64277a86d9ee84804412d148ece2e3feff1c23021c557d7224d7bab6172ce894"}
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.594322 4921 scope.go:117] "RemoveContainer" containerID="64277a86d9ee84804412d148ece2e3feff1c23021c557d7224d7bab6172ce894"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.617566 4921 scope.go:117] "RemoveContainer" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.643368 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.646272 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl674"]
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.649126 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rl674"]
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.657254 4921 scope.go:117] "RemoveContainer" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.688101 4921 scope.go:117] "RemoveContainer" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"
Mar 12 13:22:31 crc kubenswrapper[4921]: W0312 13:22:31.696688 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f647378_a687_4aa8_bc1d_0f3276a1e9d7.slice/crio-c338b6523793ae5e1668f7951244dc7f36c4f0efd8f1f495a3d680bd88838ab3 WatchSource:0}: Error finding container c338b6523793ae5e1668f7951244dc7f36c4f0efd8f1f495a3d680bd88838ab3: Status 404 returned error can't find the container with id c338b6523793ae5e1668f7951244dc7f36c4f0efd8f1f495a3d680bd88838ab3
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.732932 4921 scope.go:117] "RemoveContainer" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.758210 4921 scope.go:117] "RemoveContainer" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.771291 4921 scope.go:117] "RemoveContainer" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.785660 4921 scope.go:117] "RemoveContainer" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.807777 4921 scope.go:117] "RemoveContainer" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.820040 4921 scope.go:117] "RemoveContainer" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.820512 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": container with ID starting with 646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624 not found: ID does not exist" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.820540 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} err="failed to get container status \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": rpc error: code = NotFound desc = could not find container \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": container with ID starting with 646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.820560 4921 scope.go:117] "RemoveContainer" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.820959 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": container with ID starting with 704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c not found: ID does not exist" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.820987 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} err="failed to get container status \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": rpc error: code = NotFound desc = could not find container \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": container with ID starting with 704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.821004 4921 scope.go:117] "RemoveContainer" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.821383 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": container with ID starting with a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894 not found: ID does not exist" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.821400 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} err="failed to get container status \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": rpc error: code = NotFound desc = could not find container \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": container with ID starting with a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.821414 4921 scope.go:117] "RemoveContainer" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.821959 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": container with ID starting with a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30 not found: ID does not exist" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.821976 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} err="failed to get container status \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": rpc error: code = NotFound desc = could not find container \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": container with ID starting with a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.821989 4921 scope.go:117] "RemoveContainer" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.822269 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": container with ID starting with f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218 not found: ID does not exist" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.822291 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} err="failed to get container status \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": rpc error: code = NotFound desc = could not find container \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": container with ID starting with f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.822309 4921 scope.go:117] "RemoveContainer" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.822574 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": container with ID starting with 622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43 not found: ID does not exist" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.822593 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} err="failed to get container status \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": rpc error: code = NotFound desc = could not find container \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": container with ID starting with 622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.822609 4921 scope.go:117] "RemoveContainer" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.823084 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": container with ID starting with 136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7 not found: ID does not exist" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823108 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} err="failed to get container status \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": rpc error: code = NotFound desc = could not find container \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": container with ID starting with 136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823123 4921 scope.go:117] "RemoveContainer" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.823371 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": container with ID starting with 50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c not found: ID does not exist" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823395 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} err="failed to get container status \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": rpc error: code = NotFound desc = could not find container \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": container with ID starting with 50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823415 4921 scope.go:117] "RemoveContainer" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"
Mar 12 13:22:31 crc kubenswrapper[4921]: E0312 13:22:31.823630 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": container with ID starting with eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f not found: ID does not exist" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823654 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} err="failed to get container status \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": rpc error: code = NotFound desc = could not find container \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": container with ID starting with eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823670 4921 scope.go:117] "RemoveContainer" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823895 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} err="failed to get container status \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": rpc error: code = NotFound desc = could not find container \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": container with ID starting with 646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.823916 4921 scope.go:117] "RemoveContainer" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824111 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} err="failed to get container status \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": rpc error: code = NotFound desc = could not find container \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": container with ID starting with 704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824133 4921 scope.go:117] "RemoveContainer" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824309 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} err="failed to get container status \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": rpc error: code = NotFound desc = could not find container \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": container with ID starting with a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824324 4921 scope.go:117] "RemoveContainer" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824526 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} err="failed to get container status \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": rpc error: code = NotFound desc = could not find container \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": container with ID starting with a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824542 4921 scope.go:117] "RemoveContainer" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824726 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} err="failed to get container status \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": rpc error: code = NotFound desc = could not find container \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": container with ID starting with f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824741 4921 scope.go:117] "RemoveContainer" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.824999 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} err="failed to get container status \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": rpc error: code = NotFound desc = could not find container \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": container with ID starting with 622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825013 4921 scope.go:117] "RemoveContainer" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825259 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} err="failed to get container status \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": rpc error: code = NotFound desc = could not find container \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": container with ID starting with 136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825278 4921 scope.go:117] "RemoveContainer" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825482 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} err="failed to get container status \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": rpc error: code = NotFound desc = could not find container \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": container with ID starting with 50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825501 4921 scope.go:117] "RemoveContainer" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825712 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} err="failed to get container status \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": rpc error: code = NotFound desc = could not find container \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": container with ID starting with eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825732 4921 scope.go:117] "RemoveContainer" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825936 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} err="failed to get container status \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": rpc error: code = NotFound desc = could not find container \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": container with ID starting with 646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.825957 4921 scope.go:117] "RemoveContainer" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826166 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} err="failed to get container status \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": rpc error: code = NotFound desc = could not find container \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": container with ID starting with 704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826186 4921 scope.go:117] "RemoveContainer" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826381 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} err="failed to get container status \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": rpc error: code = NotFound desc = could not find container \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": container with ID starting with a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826401 4921 scope.go:117] "RemoveContainer" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826580 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} err="failed to get container status \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": rpc error: code = NotFound desc = could not find container \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": container with ID starting with a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826598 4921 scope.go:117] "RemoveContainer" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.826985 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} err="failed to get container status \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": rpc error: code = NotFound desc = could not find container \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": container with ID starting with f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827009 4921 scope.go:117] "RemoveContainer" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827184 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} err="failed to get container status \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": rpc error: code = NotFound desc = could not find container \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": container with ID starting with 622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827204 4921 scope.go:117] "RemoveContainer" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827370 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} err="failed to get container status \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": rpc error: code = NotFound desc = could not find container \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": container with ID starting with 136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827389 4921 scope.go:117] "RemoveContainer" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827546 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} err="failed to get container status \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": rpc error: code = NotFound desc = could not find container \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": container with ID starting with 50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827565 4921 scope.go:117] "RemoveContainer" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827723 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} err="failed to get container status \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": rpc error: code = NotFound desc = could not find container \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": container with ID starting with eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.827743 4921 scope.go:117] "RemoveContainer" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.828482 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} err="failed to get container status \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": rpc error: code = NotFound desc = could not find container \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": container with ID starting with 646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.828498 4921 scope.go:117] "RemoveContainer" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.828667 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} err="failed to get container status \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": rpc error: code = NotFound desc = could not find container \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": container with ID starting with 704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.828695 4921 scope.go:117] "RemoveContainer" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.828881 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} err="failed to get container status \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": rpc error: code = NotFound desc = could not find container \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": container with ID starting with a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.828905 4921 scope.go:117] "RemoveContainer" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829075 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} err="failed to get container status \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": rpc error: code = NotFound desc = could not find container \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": container with ID starting with a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829097 4921 scope.go:117] "RemoveContainer" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829294 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} err="failed to get container status \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": rpc error: code = NotFound desc = could not find container \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": container with ID starting with f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829313 4921 scope.go:117] "RemoveContainer" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829458 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} err="failed to get container status \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": rpc error: code = NotFound desc = could not find container \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": container with ID starting with 622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829475 4921 scope.go:117] "RemoveContainer" containerID="136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829618 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7"} err="failed to get container status \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": rpc error: code = NotFound desc = could not find container \"136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7\": container with ID starting with 136a4cdc72291bd74c394fc2738100a0b4378f94b523b18b62ec4cd24700fbc7 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829640 4921 scope.go:117] "RemoveContainer" containerID="50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829843 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c"} err="failed to get container status \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": rpc error: code = NotFound desc = could not find container \"50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c\": container with ID starting with 50afbcd7abe28596ad3e96ddff6b79ff8df2818889b0e7e061f706acc896c60c not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.829865 4921 scope.go:117] "RemoveContainer" containerID="eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830022 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f"} err="failed to get container status \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": rpc error: code = NotFound desc = could not find container \"eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f\": container with ID starting with eca83b585cc4e76d2cb4a15c15d96690b3b1dcf90cfb66444bb26a6bc8a0bd4f not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830043 4921 scope.go:117] "RemoveContainer" containerID="646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830184 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624"} err="failed to get container status \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": rpc error: code = NotFound desc = could not find container \"646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624\": container with ID starting with 646737d24266d8d06ac57debf02a1c9aef106cd7dba28b70c280f19e8c7a1624 not found: ID does not exist"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830204 4921 scope.go:117] "RemoveContainer" containerID="704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"
Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830406 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c"} err="failed to get container status \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": rpc error: code = NotFound desc = could not find container \"704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c\": container with ID starting with 704ab7b24b16f099062190ba9e24d8d62e4149505b28b538350c95367d4a608c not found: ID does not 
exist" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830426 4921 scope.go:117] "RemoveContainer" containerID="a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830569 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894"} err="failed to get container status \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": rpc error: code = NotFound desc = could not find container \"a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894\": container with ID starting with a5f2788ce1eaa8c293961e59f009e8c2e0ed0a36a9b7b762007d7b529b96d894 not found: ID does not exist" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830591 4921 scope.go:117] "RemoveContainer" containerID="a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830747 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30"} err="failed to get container status \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": rpc error: code = NotFound desc = could not find container \"a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30\": container with ID starting with a5e22209a4c775ebbc86fd116e094907556589878d4b144616b42a2fc6be3e30 not found: ID does not exist" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830762 4921 scope.go:117] "RemoveContainer" containerID="f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830936 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218"} err="failed to get container status 
\"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": rpc error: code = NotFound desc = could not find container \"f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218\": container with ID starting with f204f7947805b85a1a5675f8579cc2a67a07f5369dc8f22263b858cae2973218 not found: ID does not exist" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.830964 4921 scope.go:117] "RemoveContainer" containerID="622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43" Mar 12 13:22:31 crc kubenswrapper[4921]: I0312 13:22:31.831105 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43"} err="failed to get container status \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": rpc error: code = NotFound desc = could not find container \"622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43\": container with ID starting with 622a48cf7b762fafc0262cccb7afe44012180e91a77b03071d53cc846d0d6e43 not found: ID does not exist" Mar 12 13:22:32 crc kubenswrapper[4921]: I0312 13:22:32.004711 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5c679df-0a81-4663-a3fc-d7247c933507" path="/var/lib/kubelet/pods/d5c679df-0a81-4663-a3fc-d7247c933507/volumes" Mar 12 13:22:32 crc kubenswrapper[4921]: I0312 13:22:32.600804 4921 generic.go:334] "Generic (PLEG): container finished" podID="9f647378-a687-4aa8-bc1d-0f3276a1e9d7" containerID="0b78fe653dcb606b990377f2acd9955fd40c34c8337aa2efadf1b3f9659b1cbe" exitCode=0 Mar 12 13:22:32 crc kubenswrapper[4921]: I0312 13:22:32.600874 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerDied","Data":"0b78fe653dcb606b990377f2acd9955fd40c34c8337aa2efadf1b3f9659b1cbe"} Mar 12 13:22:32 crc kubenswrapper[4921]: I0312 13:22:32.601285 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"c338b6523793ae5e1668f7951244dc7f36c4f0efd8f1f495a3d680bd88838ab3"} Mar 12 13:22:32 crc kubenswrapper[4921]: I0312 13:22:32.606065 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q6tv6_db00f274-e86e-48c1-b0fe-5b4750265b85/kube-multus/0.log" Mar 12 13:22:32 crc kubenswrapper[4921]: I0312 13:22:32.606145 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q6tv6" event={"ID":"db00f274-e86e-48c1-b0fe-5b4750265b85","Type":"ContainerStarted","Data":"d15381879739b30d7291fb321e18df3cb8d4cbe9958b586615b85d335da67c6e"} Mar 12 13:22:33 crc kubenswrapper[4921]: I0312 13:22:33.620479 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"70b9319929e8763bc8ffafd170f0d12bf95297a87a3bb6b38f7b1a953fcb7740"} Mar 12 13:22:33 crc kubenswrapper[4921]: I0312 13:22:33.621029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"51780e4ab8972458584cc64768c6470e455a4612531253e9ad514dba982fa3cc"} Mar 12 13:22:33 crc kubenswrapper[4921]: I0312 13:22:33.621053 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"94153cccd2f3c0562d386425f31a051a2bc75a5cf10fbdb7824aea8cdd3d5e60"} Mar 12 13:22:33 crc kubenswrapper[4921]: I0312 13:22:33.621074 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" 
event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"541f18720ecd551da00a60e862d98857dc34ea61a84162f24c5ccd50ec760dc7"} Mar 12 13:22:33 crc kubenswrapper[4921]: I0312 13:22:33.621093 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"ea7c0cf148ac51a8838b79bc068414a5412071f324938951ffbb1186c7b5513f"} Mar 12 13:22:33 crc kubenswrapper[4921]: I0312 13:22:33.621114 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"26b6efcf3668e6d88306a8e709a7967a832df6874fe280535d1e81be5e03134f"} Mar 12 13:22:35 crc kubenswrapper[4921]: I0312 13:22:35.646450 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"1f6cee1a0254644528b7d88860020072b58f4a66b87038a681323feb43f874d5"} Mar 12 13:22:38 crc kubenswrapper[4921]: I0312 13:22:38.575792 4921 scope.go:117] "RemoveContainer" containerID="82c2fad463f497e13167cbacc577b607de465341fbac53636744fe234966b3b1" Mar 12 13:22:38 crc kubenswrapper[4921]: I0312 13:22:38.669334 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" event={"ID":"9f647378-a687-4aa8-bc1d-0f3276a1e9d7","Type":"ContainerStarted","Data":"f41e20c37686e46c3010b460a5376ba38bb166abe25fc408dbeda5c069c46a31"} Mar 12 13:22:38 crc kubenswrapper[4921]: I0312 13:22:38.670792 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:38 crc kubenswrapper[4921]: I0312 13:22:38.711013 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" 
podStartSLOduration=7.710984223 podStartE2EDuration="7.710984223s" podCreationTimestamp="2026-03-12 13:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:22:38.706407586 +0000 UTC m=+781.396479627" watchObservedRunningTime="2026-03-12 13:22:38.710984223 +0000 UTC m=+781.401056234" Mar 12 13:22:38 crc kubenswrapper[4921]: I0312 13:22:38.754597 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:39 crc kubenswrapper[4921]: I0312 13:22:39.674443 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:39 crc kubenswrapper[4921]: I0312 13:22:39.674491 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:39 crc kubenswrapper[4921]: I0312 13:22:39.700772 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:22:56 crc kubenswrapper[4921]: I0312 13:22:56.324474 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:22:56 crc kubenswrapper[4921]: I0312 13:22:56.325335 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.359741 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk"] Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.361046 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.362524 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.372998 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk"] Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.450584 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.450702 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6hw\" (UniqueName: \"kubernetes.io/projected/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-kube-api-access-bb6hw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.450736 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-util\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.551145 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6hw\" (UniqueName: \"kubernetes.io/projected/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-kube-api-access-bb6hw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.551185 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.551218 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.551642 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.551954 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.575451 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6hw\" (UniqueName: \"kubernetes.io/projected/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-kube-api-access-bb6hw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.676731 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:22:59 crc kubenswrapper[4921]: I0312 13:22:59.922259 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk"] Mar 12 13:23:00 crc kubenswrapper[4921]: I0312 13:23:00.801037 4921 generic.go:334] "Generic (PLEG): container finished" podID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerID="337946e8e9f8ce0d2c4e93376aa4b2a9b0c56e5fa0f370eb1147c3eda8679cf8" exitCode=0 Mar 12 13:23:00 crc kubenswrapper[4921]: I0312 13:23:00.801114 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" event={"ID":"6cbcab60-00bb-4477-a36c-5d3f8298ab6b","Type":"ContainerDied","Data":"337946e8e9f8ce0d2c4e93376aa4b2a9b0c56e5fa0f370eb1147c3eda8679cf8"} Mar 12 13:23:00 crc kubenswrapper[4921]: I0312 13:23:00.801412 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" event={"ID":"6cbcab60-00bb-4477-a36c-5d3f8298ab6b","Type":"ContainerStarted","Data":"421aeb57dc72cafb0434d03b7b237184ae2ae5ac15f82f9281982b3b4891a637"} Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.681621 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t9lfk" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.718445 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qtn4l"] Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.720740 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.731189 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtn4l"] Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.885570 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-utilities\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.887018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcxkz\" (UniqueName: \"kubernetes.io/projected/99c2fc63-694b-4885-8553-87f2a2bfbcc1-kube-api-access-qcxkz\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.887216 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-catalog-content\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.988066 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-catalog-content\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.988513 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-utilities\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.988595 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-catalog-content\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.988731 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcxkz\" (UniqueName: \"kubernetes.io/projected/99c2fc63-694b-4885-8553-87f2a2bfbcc1-kube-api-access-qcxkz\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:01 crc kubenswrapper[4921]: I0312 13:23:01.989346 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-utilities\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.010504 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcxkz\" (UniqueName: \"kubernetes.io/projected/99c2fc63-694b-4885-8553-87f2a2bfbcc1-kube-api-access-qcxkz\") pod \"redhat-operators-qtn4l\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.057784 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.483761 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qtn4l"] Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.816300 4921 generic.go:334] "Generic (PLEG): container finished" podID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerID="88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3" exitCode=0 Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.816348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerDied","Data":"88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3"} Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.816375 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerStarted","Data":"961c546a18193425910bdc05256dd8b7b6286cde4c6536fd3125f08e794769f4"} Mar 12 13:23:02 crc kubenswrapper[4921]: I0312 13:23:02.879609 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:23:03 crc kubenswrapper[4921]: I0312 13:23:03.830977 4921 generic.go:334] "Generic (PLEG): container finished" podID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerID="83366204be36ed6d013cf2ce86c9db89f8f9c84797a7e7a37a05dca283c27ad6" exitCode=0 Mar 12 13:23:03 crc kubenswrapper[4921]: I0312 13:23:03.831026 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" event={"ID":"6cbcab60-00bb-4477-a36c-5d3f8298ab6b","Type":"ContainerDied","Data":"83366204be36ed6d013cf2ce86c9db89f8f9c84797a7e7a37a05dca283c27ad6"} Mar 12 13:23:04 crc kubenswrapper[4921]: I0312 13:23:04.840613 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerStarted","Data":"3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012"} Mar 12 13:23:04 crc kubenswrapper[4921]: I0312 13:23:04.844006 4921 generic.go:334] "Generic (PLEG): container finished" podID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerID="9b6bda2e7f7dd2ac30f6c44d8d65c960ad115d9195e2ce01bf51eb019782bd6c" exitCode=0 Mar 12 13:23:04 crc kubenswrapper[4921]: I0312 13:23:04.844096 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" event={"ID":"6cbcab60-00bb-4477-a36c-5d3f8298ab6b","Type":"ContainerDied","Data":"9b6bda2e7f7dd2ac30f6c44d8d65c960ad115d9195e2ce01bf51eb019782bd6c"} Mar 12 13:23:05 crc kubenswrapper[4921]: I0312 13:23:05.853808 4921 generic.go:334] "Generic (PLEG): container finished" podID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerID="3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012" exitCode=0 Mar 12 13:23:05 crc kubenswrapper[4921]: I0312 13:23:05.853935 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerDied","Data":"3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012"} Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.141425 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.142874 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6hw\" (UniqueName: \"kubernetes.io/projected/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-kube-api-access-bb6hw\") pod \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.142941 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-util\") pod \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.142976 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-bundle\") pod \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\" (UID: \"6cbcab60-00bb-4477-a36c-5d3f8298ab6b\") " Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.143656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-bundle" (OuterVolumeSpecName: "bundle") pod "6cbcab60-00bb-4477-a36c-5d3f8298ab6b" (UID: "6cbcab60-00bb-4477-a36c-5d3f8298ab6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.153095 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-kube-api-access-bb6hw" (OuterVolumeSpecName: "kube-api-access-bb6hw") pod "6cbcab60-00bb-4477-a36c-5d3f8298ab6b" (UID: "6cbcab60-00bb-4477-a36c-5d3f8298ab6b"). InnerVolumeSpecName "kube-api-access-bb6hw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.159592 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-util" (OuterVolumeSpecName: "util") pod "6cbcab60-00bb-4477-a36c-5d3f8298ab6b" (UID: "6cbcab60-00bb-4477-a36c-5d3f8298ab6b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.244791 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-util\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.244867 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.244886 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6hw\" (UniqueName: \"kubernetes.io/projected/6cbcab60-00bb-4477-a36c-5d3f8298ab6b-kube-api-access-bb6hw\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.863254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerStarted","Data":"90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19"} Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.876543 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" event={"ID":"6cbcab60-00bb-4477-a36c-5d3f8298ab6b","Type":"ContainerDied","Data":"421aeb57dc72cafb0434d03b7b237184ae2ae5ac15f82f9281982b3b4891a637"} Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.876610 4921 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="421aeb57dc72cafb0434d03b7b237184ae2ae5ac15f82f9281982b3b4891a637" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.876960 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk" Mar 12 13:23:06 crc kubenswrapper[4921]: I0312 13:23:06.901159 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qtn4l" podStartSLOduration=2.238076503 podStartE2EDuration="5.901140214s" podCreationTimestamp="2026-03-12 13:23:01 +0000 UTC" firstStartedPulling="2026-03-12 13:23:02.879330859 +0000 UTC m=+805.569402830" lastFinishedPulling="2026-03-12 13:23:06.54239454 +0000 UTC m=+809.232466541" observedRunningTime="2026-03-12 13:23:06.893959809 +0000 UTC m=+809.584031800" watchObservedRunningTime="2026-03-12 13:23:06.901140214 +0000 UTC m=+809.591212195" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.817371 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l"] Mar 12 13:23:09 crc kubenswrapper[4921]: E0312 13:23:09.817957 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="util" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.817971 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="util" Mar 12 13:23:09 crc kubenswrapper[4921]: E0312 13:23:09.817979 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="pull" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.817985 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="pull" Mar 12 13:23:09 crc kubenswrapper[4921]: E0312 
13:23:09.818000 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="extract" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.818009 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="extract" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.818122 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cbcab60-00bb-4477-a36c-5d3f8298ab6b" containerName="extract" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.818547 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.820741 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tg2rq" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.820907 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.832761 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l"] Mar 12 13:23:09 crc kubenswrapper[4921]: I0312 13:23:09.836760 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 12 13:23:10 crc kubenswrapper[4921]: I0312 13:23:10.000145 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfck\" (UniqueName: \"kubernetes.io/projected/7258907e-4b4e-41d5-aac1-9d0fb967e5fd-kube-api-access-tmfck\") pod \"nmstate-operator-796d4cfff4-nvv9l\" (UID: \"7258907e-4b4e-41d5-aac1-9d0fb967e5fd\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" Mar 12 13:23:10 crc kubenswrapper[4921]: I0312 13:23:10.101502 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tmfck\" (UniqueName: \"kubernetes.io/projected/7258907e-4b4e-41d5-aac1-9d0fb967e5fd-kube-api-access-tmfck\") pod \"nmstate-operator-796d4cfff4-nvv9l\" (UID: \"7258907e-4b4e-41d5-aac1-9d0fb967e5fd\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" Mar 12 13:23:10 crc kubenswrapper[4921]: I0312 13:23:10.121155 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfck\" (UniqueName: \"kubernetes.io/projected/7258907e-4b4e-41d5-aac1-9d0fb967e5fd-kube-api-access-tmfck\") pod \"nmstate-operator-796d4cfff4-nvv9l\" (UID: \"7258907e-4b4e-41d5-aac1-9d0fb967e5fd\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" Mar 12 13:23:10 crc kubenswrapper[4921]: I0312 13:23:10.139366 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" Mar 12 13:23:10 crc kubenswrapper[4921]: I0312 13:23:10.533042 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l"] Mar 12 13:23:10 crc kubenswrapper[4921]: W0312 13:23:10.535673 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7258907e_4b4e_41d5_aac1_9d0fb967e5fd.slice/crio-6a9d2f5938d540b230445ccfed7afb47d624707cdf21b68d9032c184b1773708 WatchSource:0}: Error finding container 6a9d2f5938d540b230445ccfed7afb47d624707cdf21b68d9032c184b1773708: Status 404 returned error can't find the container with id 6a9d2f5938d540b230445ccfed7afb47d624707cdf21b68d9032c184b1773708 Mar 12 13:23:10 crc kubenswrapper[4921]: I0312 13:23:10.897373 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" event={"ID":"7258907e-4b4e-41d5-aac1-9d0fb967e5fd","Type":"ContainerStarted","Data":"6a9d2f5938d540b230445ccfed7afb47d624707cdf21b68d9032c184b1773708"} Mar 12 13:23:12 
crc kubenswrapper[4921]: I0312 13:23:12.057996 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:12 crc kubenswrapper[4921]: I0312 13:23:12.058454 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:13 crc kubenswrapper[4921]: I0312 13:23:13.100275 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qtn4l" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="registry-server" probeResult="failure" output=< Mar 12 13:23:13 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 13:23:13 crc kubenswrapper[4921]: > Mar 12 13:23:13 crc kubenswrapper[4921]: I0312 13:23:13.924210 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" event={"ID":"7258907e-4b4e-41d5-aac1-9d0fb967e5fd","Type":"ContainerStarted","Data":"b805dae7813bef4962bfa820d430ddf9a2187bf7bbac7e0bf1b61f0ae0fcc0ff"} Mar 12 13:23:13 crc kubenswrapper[4921]: I0312 13:23:13.944781 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvv9l" podStartSLOduration=1.9621781 podStartE2EDuration="4.944764445s" podCreationTimestamp="2026-03-12 13:23:09 +0000 UTC" firstStartedPulling="2026-03-12 13:23:10.53819996 +0000 UTC m=+813.228271931" lastFinishedPulling="2026-03-12 13:23:13.520786305 +0000 UTC m=+816.210858276" observedRunningTime="2026-03-12 13:23:13.944724053 +0000 UTC m=+816.634796024" watchObservedRunningTime="2026-03-12 13:23:13.944764445 +0000 UTC m=+816.634836436" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.673138 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.674483 4921 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.676538 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-r7th5" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.686591 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.689729 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-kf975"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.690407 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.702048 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-2x8kb"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.702838 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.705262 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-kf975"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.714269 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.799182 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.800002 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.801782 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.801982 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-hcrnt" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.803774 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.806475 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69"] Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814480 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqm4\" (UniqueName: \"kubernetes.io/projected/20f1f547-f958-419e-a5c2-58695625d6ad-kube-api-access-bkqm4\") pod \"nmstate-webhook-5f558f5558-kf975\" (UID: \"20f1f547-f958-419e-a5c2-58695625d6ad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814519 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-nmstate-lock\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814539 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-ovs-socket\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " 
pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814556 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cqvn\" (UniqueName: \"kubernetes.io/projected/e3a3372c-64ea-4841-91b6-55d6dbc9490a-kube-api-access-4cqvn\") pod \"nmstate-metrics-9b8c8685d-tkdph\" (UID: \"e3a3372c-64ea-4841-91b6-55d6dbc9490a\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814619 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhz7d\" (UniqueName: \"kubernetes.io/projected/7fea2e61-eacd-4cef-9425-2e03106cf6f4-kube-api-access-xhz7d\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814655 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-dbus-socket\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.814679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f1f547-f958-419e-a5c2-58695625d6ad-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kf975\" (UID: \"20f1f547-f958-419e-a5c2-58695625d6ad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916736 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1bd23bf-3c09-41ff-9840-3397219f3f4d-plugin-serving-cert\") pod 
\"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqm4\" (UniqueName: \"kubernetes.io/projected/20f1f547-f958-419e-a5c2-58695625d6ad-kube-api-access-bkqm4\") pod \"nmstate-webhook-5f558f5558-kf975\" (UID: \"20f1f547-f958-419e-a5c2-58695625d6ad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916854 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-nmstate-lock\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916886 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-ovs-socket\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916932 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cqvn\" (UniqueName: \"kubernetes.io/projected/e3a3372c-64ea-4841-91b6-55d6dbc9490a-kube-api-access-4cqvn\") pod \"nmstate-metrics-9b8c8685d-tkdph\" (UID: \"e3a3372c-64ea-4841-91b6-55d6dbc9490a\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916959 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r2jd\" (UniqueName: 
\"kubernetes.io/projected/e1bd23bf-3c09-41ff-9840-3397219f3f4d-kube-api-access-8r2jd\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.916998 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhz7d\" (UniqueName: \"kubernetes.io/projected/7fea2e61-eacd-4cef-9425-2e03106cf6f4-kube-api-access-xhz7d\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.917048 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-dbus-socket\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.917070 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1bd23bf-3c09-41ff-9840-3397219f3f4d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.917106 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f1f547-f958-419e-a5c2-58695625d6ad-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kf975\" (UID: \"20f1f547-f958-419e-a5c2-58695625d6ad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.917386 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-nmstate-lock\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.917459 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-ovs-socket\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.917686 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7fea2e61-eacd-4cef-9425-2e03106cf6f4-dbus-socket\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.939564 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/20f1f547-f958-419e-a5c2-58695625d6ad-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kf975\" (UID: \"20f1f547-f958-419e-a5c2-58695625d6ad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.943743 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhz7d\" (UniqueName: \"kubernetes.io/projected/7fea2e61-eacd-4cef-9425-2e03106cf6f4-kube-api-access-xhz7d\") pod \"nmstate-handler-2x8kb\" (UID: \"7fea2e61-eacd-4cef-9425-2e03106cf6f4\") " pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.946182 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cqvn\" (UniqueName: 
\"kubernetes.io/projected/e3a3372c-64ea-4841-91b6-55d6dbc9490a-kube-api-access-4cqvn\") pod \"nmstate-metrics-9b8c8685d-tkdph\" (UID: \"e3a3372c-64ea-4841-91b6-55d6dbc9490a\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.951180 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqm4\" (UniqueName: \"kubernetes.io/projected/20f1f547-f958-419e-a5c2-58695625d6ad-kube-api-access-bkqm4\") pod \"nmstate-webhook-5f558f5558-kf975\" (UID: \"20f1f547-f958-419e-a5c2-58695625d6ad\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:19 crc kubenswrapper[4921]: I0312 13:23:19.988954 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.003448 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.018250 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.018566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1bd23bf-3c09-41ff-9840-3397219f3f4d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.018645 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1bd23bf-3c09-41ff-9840-3397219f3f4d-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.018695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r2jd\" (UniqueName: \"kubernetes.io/projected/e1bd23bf-3c09-41ff-9840-3397219f3f4d-kube-api-access-8r2jd\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.019890 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1bd23bf-3c09-41ff-9840-3397219f3f4d-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.023178 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1bd23bf-3c09-41ff-9840-3397219f3f4d-plugin-serving-cert\") pod 
\"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.023557 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79fff8f7d6-wjl8k"] Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.024199 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.048724 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r2jd\" (UniqueName: \"kubernetes.io/projected/e1bd23bf-3c09-41ff-9840-3397219f3f4d-kube-api-access-8r2jd\") pod \"nmstate-console-plugin-86f58fcf4-ppq69\" (UID: \"e1bd23bf-3c09-41ff-9840-3397219f3f4d\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.062578 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79fff8f7d6-wjl8k"] Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.120557 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220535 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-trusted-ca-bundle\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220575 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-service-ca\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220620 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p87z9\" (UniqueName: \"kubernetes.io/projected/57af1cba-c918-4500-91da-75051727b6f9-kube-api-access-p87z9\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220645 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57af1cba-c918-4500-91da-75051727b6f9-console-serving-cert\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220660 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/57af1cba-c918-4500-91da-75051727b6f9-console-oauth-config\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220729 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-console-config\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.220838 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-oauth-serving-cert\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.286080 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-kf975"] Mar 12 13:23:20 crc kubenswrapper[4921]: W0312 13:23:20.287157 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f1f547_f958_419e_a5c2_58695625d6ad.slice/crio-aba0aa4d9467eb41d3c9ceaacbc06391a1c20a0c5289628a0f4474ab2cbf0cee WatchSource:0}: Error finding container aba0aa4d9467eb41d3c9ceaacbc06391a1c20a0c5289628a0f4474ab2cbf0cee: Status 404 returned error can't find the container with id aba0aa4d9467eb41d3c9ceaacbc06391a1c20a0c5289628a0f4474ab2cbf0cee Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.321865 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-oauth-serving-cert\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.321975 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-trusted-ca-bundle\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.322023 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-service-ca\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.322051 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p87z9\" (UniqueName: \"kubernetes.io/projected/57af1cba-c918-4500-91da-75051727b6f9-kube-api-access-p87z9\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.322098 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57af1cba-c918-4500-91da-75051727b6f9-console-serving-cert\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.322122 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/57af1cba-c918-4500-91da-75051727b6f9-console-oauth-config\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.322202 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-console-config\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.323655 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-oauth-serving-cert\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.323967 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-console-config\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.324839 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-trusted-ca-bundle\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.324853 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/57af1cba-c918-4500-91da-75051727b6f9-service-ca\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.330593 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57af1cba-c918-4500-91da-75051727b6f9-console-oauth-config\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.330743 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57af1cba-c918-4500-91da-75051727b6f9-console-serving-cert\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.340287 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p87z9\" (UniqueName: \"kubernetes.io/projected/57af1cba-c918-4500-91da-75051727b6f9-kube-api-access-p87z9\") pod \"console-79fff8f7d6-wjl8k\" (UID: \"57af1cba-c918-4500-91da-75051727b6f9\") " pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.378103 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.380437 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69"] Mar 12 13:23:20 crc kubenswrapper[4921]: W0312 13:23:20.383583 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1bd23bf_3c09_41ff_9840_3397219f3f4d.slice/crio-8fda2c7296e1f0ed9b73e51d22544e80a3d1bf30ca40f99edb97292ca1a57de2 WatchSource:0}: Error finding container 8fda2c7296e1f0ed9b73e51d22544e80a3d1bf30ca40f99edb97292ca1a57de2: Status 404 returned error can't find the container with id 8fda2c7296e1f0ed9b73e51d22544e80a3d1bf30ca40f99edb97292ca1a57de2 Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.419369 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" event={"ID":"20f1f547-f958-419e-a5c2-58695625d6ad","Type":"ContainerStarted","Data":"aba0aa4d9467eb41d3c9ceaacbc06391a1c20a0c5289628a0f4474ab2cbf0cee"} Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.420857 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2x8kb" event={"ID":"7fea2e61-eacd-4cef-9425-2e03106cf6f4","Type":"ContainerStarted","Data":"3c037bb556766dfadb102a912818c6d8faae2acb52c168385b5fe8749bf12b6c"} Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.421903 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" event={"ID":"e1bd23bf-3c09-41ff-9840-3397219f3f4d","Type":"ContainerStarted","Data":"8fda2c7296e1f0ed9b73e51d22544e80a3d1bf30ca40f99edb97292ca1a57de2"} Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.573458 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph"] Mar 12 13:23:20 crc kubenswrapper[4921]: W0312 
13:23:20.579787 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a3372c_64ea_4841_91b6_55d6dbc9490a.slice/crio-b02dcae91d8e799ef3bbcb3cbc81861b2f6102ca8c7ddf9e4641cf8ffb54f76d WatchSource:0}: Error finding container b02dcae91d8e799ef3bbcb3cbc81861b2f6102ca8c7ddf9e4641cf8ffb54f76d: Status 404 returned error can't find the container with id b02dcae91d8e799ef3bbcb3cbc81861b2f6102ca8c7ddf9e4641cf8ffb54f76d Mar 12 13:23:20 crc kubenswrapper[4921]: I0312 13:23:20.602433 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79fff8f7d6-wjl8k"] Mar 12 13:23:20 crc kubenswrapper[4921]: W0312 13:23:20.608948 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57af1cba_c918_4500_91da_75051727b6f9.slice/crio-2165695d119c25f386588f992e60be4fb969fa5a4bc94cc71c4fb375a7ac2291 WatchSource:0}: Error finding container 2165695d119c25f386588f992e60be4fb969fa5a4bc94cc71c4fb375a7ac2291: Status 404 returned error can't find the container with id 2165695d119c25f386588f992e60be4fb969fa5a4bc94cc71c4fb375a7ac2291 Mar 12 13:23:21 crc kubenswrapper[4921]: I0312 13:23:21.434325 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79fff8f7d6-wjl8k" event={"ID":"57af1cba-c918-4500-91da-75051727b6f9","Type":"ContainerStarted","Data":"2009f87912eb5b635c345f084b3a1b38a11517c30af2063d9bc6467b8610b783"} Mar 12 13:23:21 crc kubenswrapper[4921]: I0312 13:23:21.434733 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79fff8f7d6-wjl8k" event={"ID":"57af1cba-c918-4500-91da-75051727b6f9","Type":"ContainerStarted","Data":"2165695d119c25f386588f992e60be4fb969fa5a4bc94cc71c4fb375a7ac2291"} Mar 12 13:23:21 crc kubenswrapper[4921]: I0312 13:23:21.437344 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" event={"ID":"e3a3372c-64ea-4841-91b6-55d6dbc9490a","Type":"ContainerStarted","Data":"b02dcae91d8e799ef3bbcb3cbc81861b2f6102ca8c7ddf9e4641cf8ffb54f76d"} Mar 12 13:23:21 crc kubenswrapper[4921]: I0312 13:23:21.470473 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79fff8f7d6-wjl8k" podStartSLOduration=1.470452598 podStartE2EDuration="1.470452598s" podCreationTimestamp="2026-03-12 13:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:23:21.467563107 +0000 UTC m=+824.157635088" watchObservedRunningTime="2026-03-12 13:23:21.470452598 +0000 UTC m=+824.160524589" Mar 12 13:23:22 crc kubenswrapper[4921]: I0312 13:23:22.104428 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:22 crc kubenswrapper[4921]: I0312 13:23:22.148560 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:22 crc kubenswrapper[4921]: I0312 13:23:22.337987 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qtn4l"] Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.449997 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" event={"ID":"e3a3372c-64ea-4841-91b6-55d6dbc9490a","Type":"ContainerStarted","Data":"4566cc21fa2f4a4cb1a658e2c07e1668682a9d3aac30acff5c3abf1c75366361"} Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.451960 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" event={"ID":"20f1f547-f958-419e-a5c2-58695625d6ad","Type":"ContainerStarted","Data":"178d830447f6749d56a9e29dd107a1d11b2c9742f8e41af405b332e436d199c3"} Mar 12 13:23:23 crc 
kubenswrapper[4921]: I0312 13:23:23.452909 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.457130 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qtn4l" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="registry-server" containerID="cri-o://90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19" gracePeriod=2 Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.457944 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-2x8kb" event={"ID":"7fea2e61-eacd-4cef-9425-2e03106cf6f4","Type":"ContainerStarted","Data":"41dc77feaf050931e7bbec0f0ad37e5cbb8e05dc750b3fc23d5ee7d67fde75b0"} Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.458130 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.477087 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" podStartSLOduration=2.2060370320000002 podStartE2EDuration="4.477067746s" podCreationTimestamp="2026-03-12 13:23:19 +0000 UTC" firstStartedPulling="2026-03-12 13:23:20.289005445 +0000 UTC m=+822.979077416" lastFinishedPulling="2026-03-12 13:23:22.560036159 +0000 UTC m=+825.250108130" observedRunningTime="2026-03-12 13:23:23.473808804 +0000 UTC m=+826.163880815" watchObservedRunningTime="2026-03-12 13:23:23.477067746 +0000 UTC m=+826.167139717" Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.496994 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-2x8kb" podStartSLOduration=2.074354472 podStartE2EDuration="4.49697779s" podCreationTimestamp="2026-03-12 13:23:19 +0000 UTC" 
firstStartedPulling="2026-03-12 13:23:20.119525869 +0000 UTC m=+822.809597840" lastFinishedPulling="2026-03-12 13:23:22.542149187 +0000 UTC m=+825.232221158" observedRunningTime="2026-03-12 13:23:23.495688351 +0000 UTC m=+826.185760322" watchObservedRunningTime="2026-03-12 13:23:23.49697779 +0000 UTC m=+826.187049761" Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.805389 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.994728 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcxkz\" (UniqueName: \"kubernetes.io/projected/99c2fc63-694b-4885-8553-87f2a2bfbcc1-kube-api-access-qcxkz\") pod \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.994897 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-catalog-content\") pod \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.994975 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-utilities\") pod \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\" (UID: \"99c2fc63-694b-4885-8553-87f2a2bfbcc1\") " Mar 12 13:23:23 crc kubenswrapper[4921]: I0312 13:23:23.996059 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-utilities" (OuterVolumeSpecName: "utilities") pod "99c2fc63-694b-4885-8553-87f2a2bfbcc1" (UID: "99c2fc63-694b-4885-8553-87f2a2bfbcc1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:23.999501 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c2fc63-694b-4885-8553-87f2a2bfbcc1-kube-api-access-qcxkz" (OuterVolumeSpecName: "kube-api-access-qcxkz") pod "99c2fc63-694b-4885-8553-87f2a2bfbcc1" (UID: "99c2fc63-694b-4885-8553-87f2a2bfbcc1"). InnerVolumeSpecName "kube-api-access-qcxkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.098264 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcxkz\" (UniqueName: \"kubernetes.io/projected/99c2fc63-694b-4885-8553-87f2a2bfbcc1-kube-api-access-qcxkz\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.098314 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.118340 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99c2fc63-694b-4885-8553-87f2a2bfbcc1" (UID: "99c2fc63-694b-4885-8553-87f2a2bfbcc1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.199614 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99c2fc63-694b-4885-8553-87f2a2bfbcc1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.463145 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" event={"ID":"e1bd23bf-3c09-41ff-9840-3397219f3f4d","Type":"ContainerStarted","Data":"c7e765d0939000bcba502584578864b7e39c0cf3e38bb18f8d32f41003178717"} Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.468343 4921 generic.go:334] "Generic (PLEG): container finished" podID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerID="90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19" exitCode=0 Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.489346 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerDied","Data":"90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19"} Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.489440 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qtn4l" event={"ID":"99c2fc63-694b-4885-8553-87f2a2bfbcc1","Type":"ContainerDied","Data":"961c546a18193425910bdc05256dd8b7b6286cde4c6536fd3125f08e794769f4"} Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.489505 4921 scope.go:117] "RemoveContainer" containerID="90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.489555 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qtn4l" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.520794 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-ppq69" podStartSLOduration=2.534535289 podStartE2EDuration="5.520771038s" podCreationTimestamp="2026-03-12 13:23:19 +0000 UTC" firstStartedPulling="2026-03-12 13:23:20.390366065 +0000 UTC m=+823.080438036" lastFinishedPulling="2026-03-12 13:23:23.376601794 +0000 UTC m=+826.066673785" observedRunningTime="2026-03-12 13:23:24.512481618 +0000 UTC m=+827.202553609" watchObservedRunningTime="2026-03-12 13:23:24.520771038 +0000 UTC m=+827.210843019" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.544417 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qtn4l"] Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.545872 4921 scope.go:117] "RemoveContainer" containerID="3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.551542 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qtn4l"] Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.572151 4921 scope.go:117] "RemoveContainer" containerID="88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.591357 4921 scope.go:117] "RemoveContainer" containerID="90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19" Mar 12 13:23:24 crc kubenswrapper[4921]: E0312 13:23:24.591939 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19\": container with ID starting with 90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19 not found: ID does not exist" 
containerID="90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.591984 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19"} err="failed to get container status \"90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19\": rpc error: code = NotFound desc = could not find container \"90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19\": container with ID starting with 90dc3ed25079b20b454378f5fbb0fe5cb4ec4457842b772f71bf129c51bbdc19 not found: ID does not exist" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.592010 4921 scope.go:117] "RemoveContainer" containerID="3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012" Mar 12 13:23:24 crc kubenswrapper[4921]: E0312 13:23:24.592531 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012\": container with ID starting with 3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012 not found: ID does not exist" containerID="3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.592580 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012"} err="failed to get container status \"3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012\": rpc error: code = NotFound desc = could not find container \"3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012\": container with ID starting with 3ecdafefbf8ad9b83a19a80385010a45181fd895fb4e8580a91c177ffcf52012 not found: ID does not exist" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.592618 4921 scope.go:117] 
"RemoveContainer" containerID="88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3" Mar 12 13:23:24 crc kubenswrapper[4921]: E0312 13:23:24.592978 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3\": container with ID starting with 88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3 not found: ID does not exist" containerID="88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3" Mar 12 13:23:24 crc kubenswrapper[4921]: I0312 13:23:24.593006 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3"} err="failed to get container status \"88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3\": rpc error: code = NotFound desc = could not find container \"88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3\": container with ID starting with 88ffa9089998c502217377b20c55ad2a92f4a4609559cb320327fa9870e561b3 not found: ID does not exist" Mar 12 13:23:25 crc kubenswrapper[4921]: I0312 13:23:25.991102 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" path="/var/lib/kubelet/pods/99c2fc63-694b-4885-8553-87f2a2bfbcc1/volumes" Mar 12 13:23:26 crc kubenswrapper[4921]: I0312 13:23:26.323940 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:23:26 crc kubenswrapper[4921]: I0312 13:23:26.324210 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:23:26 crc kubenswrapper[4921]: I0312 13:23:26.490725 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" event={"ID":"e3a3372c-64ea-4841-91b6-55d6dbc9490a","Type":"ContainerStarted","Data":"c3d54db82191b89e7fac9c74cddba49fbd47894a79f93368c47e8c5c3f5524bb"} Mar 12 13:23:26 crc kubenswrapper[4921]: I0312 13:23:26.520905 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-tkdph" podStartSLOduration=2.714885266 podStartE2EDuration="7.51953147s" podCreationTimestamp="2026-03-12 13:23:19 +0000 UTC" firstStartedPulling="2026-03-12 13:23:20.582145801 +0000 UTC m=+823.272217772" lastFinishedPulling="2026-03-12 13:23:25.386791995 +0000 UTC m=+828.076863976" observedRunningTime="2026-03-12 13:23:26.507242374 +0000 UTC m=+829.197314385" watchObservedRunningTime="2026-03-12 13:23:26.51953147 +0000 UTC m=+829.209603471" Mar 12 13:23:30 crc kubenswrapper[4921]: I0312 13:23:30.057342 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-2x8kb" Mar 12 13:23:30 crc kubenswrapper[4921]: I0312 13:23:30.378803 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:30 crc kubenswrapper[4921]: I0312 13:23:30.378962 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:30 crc kubenswrapper[4921]: I0312 13:23:30.385380 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:30 crc kubenswrapper[4921]: I0312 13:23:30.520063 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-79fff8f7d6-wjl8k" Mar 12 13:23:30 crc kubenswrapper[4921]: I0312 13:23:30.579031 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-22xz2"] Mar 12 13:23:40 crc kubenswrapper[4921]: I0312 13:23:40.010521 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.616574 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7"] Mar 12 13:23:55 crc kubenswrapper[4921]: E0312 13:23:55.619492 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="extract-utilities" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.619594 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="extract-utilities" Mar 12 13:23:55 crc kubenswrapper[4921]: E0312 13:23:55.619688 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="registry-server" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.619758 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="registry-server" Mar 12 13:23:55 crc kubenswrapper[4921]: E0312 13:23:55.619881 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="extract-content" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.619960 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" containerName="extract-content" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.620226 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c2fc63-694b-4885-8553-87f2a2bfbcc1" 
containerName="registry-server" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.621536 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.625799 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.639059 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7"] Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.649496 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-22xz2" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerName="console" containerID="cri-o://2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782" gracePeriod=15 Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.792649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.792782 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmgt\" (UniqueName: \"kubernetes.io/projected/8247093d-09e8-4ff9-8a21-902c3135b7ab-kube-api-access-vtmgt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 
13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.792947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.893906 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.894031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.894113 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmgt\" (UniqueName: \"kubernetes.io/projected/8247093d-09e8-4ff9-8a21-902c3135b7ab-kube-api-access-vtmgt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.895097 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.895879 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.921585 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmgt\" (UniqueName: \"kubernetes.io/projected/8247093d-09e8-4ff9-8a21-902c3135b7ab-kube-api-access-vtmgt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:55 crc kubenswrapper[4921]: I0312 13:23:55.988747 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.072991 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-22xz2_57677fcb-c7a5-431c-b751-ec13d22484b1/console/0.log" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.073056 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.098272 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zft6\" (UniqueName: \"kubernetes.io/projected/57677fcb-c7a5-431c-b751-ec13d22484b1-kube-api-access-7zft6\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.098667 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-service-ca\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.098746 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-oauth-serving-cert\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.098839 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-serving-cert\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.098976 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-trusted-ca-bundle\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.098998 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-oauth-config\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.099146 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-console-config\") pod \"57677fcb-c7a5-431c-b751-ec13d22484b1\" (UID: \"57677fcb-c7a5-431c-b751-ec13d22484b1\") " Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.102169 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.105322 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-console-config" (OuterVolumeSpecName: "console-config") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.106394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.106494 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-service-ca" (OuterVolumeSpecName: "service-ca") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.112629 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.115221 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.123185 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57677fcb-c7a5-431c-b751-ec13d22484b1-kube-api-access-7zft6" (OuterVolumeSpecName: "kube-api-access-7zft6") pod "57677fcb-c7a5-431c-b751-ec13d22484b1" (UID: "57677fcb-c7a5-431c-b751-ec13d22484b1"). InnerVolumeSpecName "kube-api-access-7zft6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200787 4921 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-service-ca\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200846 4921 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200858 4921 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200867 4921 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200876 4921 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57677fcb-c7a5-431c-b751-ec13d22484b1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200884 4921 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57677fcb-c7a5-431c-b751-ec13d22484b1-console-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.200893 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zft6\" (UniqueName: \"kubernetes.io/projected/57677fcb-c7a5-431c-b751-ec13d22484b1-kube-api-access-7zft6\") on node \"crc\" DevicePath \"\"" Mar 12 13:23:56 crc 
kubenswrapper[4921]: I0312 13:23:56.262540 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7"] Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.324023 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.324263 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.324359 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.325423 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"107f4a8503d4c0486ad2c3402e1b2b2b1ceede9b611f44e27a27f3de56a8e4cf"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.325549 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://107f4a8503d4c0486ad2c3402e1b2b2b1ceede9b611f44e27a27f3de56a8e4cf" gracePeriod=600 Mar 12 13:23:56 crc 
kubenswrapper[4921]: I0312 13:23:56.719576 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="107f4a8503d4c0486ad2c3402e1b2b2b1ceede9b611f44e27a27f3de56a8e4cf" exitCode=0 Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.719664 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"107f4a8503d4c0486ad2c3402e1b2b2b1ceede9b611f44e27a27f3de56a8e4cf"} Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.720091 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"d3b38af5e8a74ac4ff0f8e664ea487d80830b0618d599c37b78cc47d7d985662"} Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.720113 4921 scope.go:117] "RemoveContainer" containerID="51cd28594939b8d7f25cf0501cf5d7fac94d792e25a45e58d39d8c8a5553a580" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.723387 4921 generic.go:334] "Generic (PLEG): container finished" podID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerID="8472af133903545b0fdb3e5f4a36a08c3768bdf639338f1360778935825ae009" exitCode=0 Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.723536 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" event={"ID":"8247093d-09e8-4ff9-8a21-902c3135b7ab","Type":"ContainerDied","Data":"8472af133903545b0fdb3e5f4a36a08c3768bdf639338f1360778935825ae009"} Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.723588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" 
event={"ID":"8247093d-09e8-4ff9-8a21-902c3135b7ab","Type":"ContainerStarted","Data":"36834c11493703c08aa2efd01d6e6072e7d01464677aaeb96c0878758d4327fe"} Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.726572 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-22xz2_57677fcb-c7a5-431c-b751-ec13d22484b1/console/0.log" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.726630 4921 generic.go:334] "Generic (PLEG): container finished" podID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerID="2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782" exitCode=2 Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.726668 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-22xz2" event={"ID":"57677fcb-c7a5-431c-b751-ec13d22484b1","Type":"ContainerDied","Data":"2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782"} Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.726704 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-22xz2" event={"ID":"57677fcb-c7a5-431c-b751-ec13d22484b1","Type":"ContainerDied","Data":"00a058fd86aefa56b6722c8efea6027b50404dae9316d21f1f3b9db3d3b66af0"} Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.726761 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-22xz2" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.777773 4921 scope.go:117] "RemoveContainer" containerID="2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.796239 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-22xz2"] Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.803734 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-22xz2"] Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.811309 4921 scope.go:117] "RemoveContainer" containerID="2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782" Mar 12 13:23:56 crc kubenswrapper[4921]: E0312 13:23:56.811758 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782\": container with ID starting with 2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782 not found: ID does not exist" containerID="2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782" Mar 12 13:23:56 crc kubenswrapper[4921]: I0312 13:23:56.811858 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782"} err="failed to get container status \"2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782\": rpc error: code = NotFound desc = could not find container \"2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782\": container with ID starting with 2299919c3c69fcf2f0ea10e2661454216b2d2709cba49d0ce908da4140433782 not found: ID does not exist" Mar 12 13:23:57 crc kubenswrapper[4921]: I0312 13:23:57.995782 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" 
path="/var/lib/kubelet/pods/57677fcb-c7a5-431c-b751-ec13d22484b1/volumes" Mar 12 13:23:59 crc kubenswrapper[4921]: I0312 13:23:59.786335 4921 generic.go:334] "Generic (PLEG): container finished" podID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerID="3fcf65b6ec9583709098176e7add1ee325b0ddc44f2df4a1b1413ca59a435127" exitCode=0 Mar 12 13:23:59 crc kubenswrapper[4921]: I0312 13:23:59.786444 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" event={"ID":"8247093d-09e8-4ff9-8a21-902c3135b7ab","Type":"ContainerDied","Data":"3fcf65b6ec9583709098176e7add1ee325b0ddc44f2df4a1b1413ca59a435127"} Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.134265 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555364-rx4zr"] Mar 12 13:24:00 crc kubenswrapper[4921]: E0312 13:24:00.134649 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerName="console" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.134676 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerName="console" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.134872 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="57677fcb-c7a5-431c-b751-ec13d22484b1" containerName="console" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.135419 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.137659 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.138199 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.138289 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.146958 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555364-rx4zr"] Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.257957 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb8h4\" (UniqueName: \"kubernetes.io/projected/9d2337cb-6456-4ad6-9be8-7da6c785025c-kube-api-access-tb8h4\") pod \"auto-csr-approver-29555364-rx4zr\" (UID: \"9d2337cb-6456-4ad6-9be8-7da6c785025c\") " pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.359346 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb8h4\" (UniqueName: \"kubernetes.io/projected/9d2337cb-6456-4ad6-9be8-7da6c785025c-kube-api-access-tb8h4\") pod \"auto-csr-approver-29555364-rx4zr\" (UID: \"9d2337cb-6456-4ad6-9be8-7da6c785025c\") " pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.385267 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb8h4\" (UniqueName: \"kubernetes.io/projected/9d2337cb-6456-4ad6-9be8-7da6c785025c-kube-api-access-tb8h4\") pod \"auto-csr-approver-29555364-rx4zr\" (UID: \"9d2337cb-6456-4ad6-9be8-7da6c785025c\") " 
pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.477193 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.760401 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555364-rx4zr"] Mar 12 13:24:00 crc kubenswrapper[4921]: W0312 13:24:00.769846 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d2337cb_6456_4ad6_9be8_7da6c785025c.slice/crio-bec73ff87501d39a55e693a11e244944e74abffc8e2d1563ac8643e2726a849f WatchSource:0}: Error finding container bec73ff87501d39a55e693a11e244944e74abffc8e2d1563ac8643e2726a849f: Status 404 returned error can't find the container with id bec73ff87501d39a55e693a11e244944e74abffc8e2d1563ac8643e2726a849f Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.794163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" event={"ID":"9d2337cb-6456-4ad6-9be8-7da6c785025c","Type":"ContainerStarted","Data":"bec73ff87501d39a55e693a11e244944e74abffc8e2d1563ac8643e2726a849f"} Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.797474 4921 generic.go:334] "Generic (PLEG): container finished" podID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerID="dbdf5d95acded1c8b4048cff91456ebbc6403d57ea311c16d924ec491895a5be" exitCode=0 Mar 12 13:24:00 crc kubenswrapper[4921]: I0312 13:24:00.797517 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" event={"ID":"8247093d-09e8-4ff9-8a21-902c3135b7ab","Type":"ContainerDied","Data":"dbdf5d95acded1c8b4048cff91456ebbc6403d57ea311c16d924ec491895a5be"} Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.060021 4921 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.185143 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtmgt\" (UniqueName: \"kubernetes.io/projected/8247093d-09e8-4ff9-8a21-902c3135b7ab-kube-api-access-vtmgt\") pod \"8247093d-09e8-4ff9-8a21-902c3135b7ab\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.185248 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-bundle\") pod \"8247093d-09e8-4ff9-8a21-902c3135b7ab\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.185411 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-util\") pod \"8247093d-09e8-4ff9-8a21-902c3135b7ab\" (UID: \"8247093d-09e8-4ff9-8a21-902c3135b7ab\") " Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.186309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-bundle" (OuterVolumeSpecName: "bundle") pod "8247093d-09e8-4ff9-8a21-902c3135b7ab" (UID: "8247093d-09e8-4ff9-8a21-902c3135b7ab"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.186684 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.196124 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-util" (OuterVolumeSpecName: "util") pod "8247093d-09e8-4ff9-8a21-902c3135b7ab" (UID: "8247093d-09e8-4ff9-8a21-902c3135b7ab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.201767 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8247093d-09e8-4ff9-8a21-902c3135b7ab-kube-api-access-vtmgt" (OuterVolumeSpecName: "kube-api-access-vtmgt") pod "8247093d-09e8-4ff9-8a21-902c3135b7ab" (UID: "8247093d-09e8-4ff9-8a21-902c3135b7ab"). InnerVolumeSpecName "kube-api-access-vtmgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.287437 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtmgt\" (UniqueName: \"kubernetes.io/projected/8247093d-09e8-4ff9-8a21-902c3135b7ab-kube-api-access-vtmgt\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.287471 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8247093d-09e8-4ff9-8a21-902c3135b7ab-util\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.819568 4921 generic.go:334] "Generic (PLEG): container finished" podID="9d2337cb-6456-4ad6-9be8-7da6c785025c" containerID="6dade520635acbe8d38599c5edab3eb49ce6c98a0fe8b1c22720cbc70fc59f28" exitCode=0 Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.819643 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" event={"ID":"9d2337cb-6456-4ad6-9be8-7da6c785025c","Type":"ContainerDied","Data":"6dade520635acbe8d38599c5edab3eb49ce6c98a0fe8b1c22720cbc70fc59f28"} Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.825359 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" event={"ID":"8247093d-09e8-4ff9-8a21-902c3135b7ab","Type":"ContainerDied","Data":"36834c11493703c08aa2efd01d6e6072e7d01464677aaeb96c0878758d4327fe"} Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.825429 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36834c11493703c08aa2efd01d6e6072e7d01464677aaeb96c0878758d4327fe" Mar 12 13:24:02 crc kubenswrapper[4921]: I0312 13:24:02.825523 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7" Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.151863 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.218540 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb8h4\" (UniqueName: \"kubernetes.io/projected/9d2337cb-6456-4ad6-9be8-7da6c785025c-kube-api-access-tb8h4\") pod \"9d2337cb-6456-4ad6-9be8-7da6c785025c\" (UID: \"9d2337cb-6456-4ad6-9be8-7da6c785025c\") " Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.224931 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2337cb-6456-4ad6-9be8-7da6c785025c-kube-api-access-tb8h4" (OuterVolumeSpecName: "kube-api-access-tb8h4") pod "9d2337cb-6456-4ad6-9be8-7da6c785025c" (UID: "9d2337cb-6456-4ad6-9be8-7da6c785025c"). InnerVolumeSpecName "kube-api-access-tb8h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.319806 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb8h4\" (UniqueName: \"kubernetes.io/projected/9d2337cb-6456-4ad6-9be8-7da6c785025c-kube-api-access-tb8h4\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.835378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" event={"ID":"9d2337cb-6456-4ad6-9be8-7da6c785025c","Type":"ContainerDied","Data":"bec73ff87501d39a55e693a11e244944e74abffc8e2d1563ac8643e2726a849f"} Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.835417 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bec73ff87501d39a55e693a11e244944e74abffc8e2d1563ac8643e2726a849f" Mar 12 13:24:04 crc kubenswrapper[4921]: I0312 13:24:04.835443 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555364-rx4zr" Mar 12 13:24:05 crc kubenswrapper[4921]: I0312 13:24:05.248625 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555358-bbn64"] Mar 12 13:24:05 crc kubenswrapper[4921]: I0312 13:24:05.256142 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555358-bbn64"] Mar 12 13:24:05 crc kubenswrapper[4921]: I0312 13:24:05.992471 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485ff5b7-a500-4dd6-b619-876713a66893" path="/var/lib/kubelet/pods/485ff5b7-a500-4dd6-b619-876713a66893/volumes" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.977352 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k"] Mar 12 13:24:10 crc kubenswrapper[4921]: E0312 13:24:10.978167 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9d2337cb-6456-4ad6-9be8-7da6c785025c" containerName="oc" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978182 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2337cb-6456-4ad6-9be8-7da6c785025c" containerName="oc" Mar 12 13:24:10 crc kubenswrapper[4921]: E0312 13:24:10.978194 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="util" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="util" Mar 12 13:24:10 crc kubenswrapper[4921]: E0312 13:24:10.978215 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="extract" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978223 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="extract" Mar 12 13:24:10 crc kubenswrapper[4921]: E0312 13:24:10.978245 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="pull" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978252 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="pull" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978367 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8247093d-09e8-4ff9-8a21-902c3135b7ab" containerName="extract" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978382 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2337cb-6456-4ad6-9be8-7da6c785025c" containerName="oc" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.978851 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.982184 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.982207 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.983214 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.983284 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 12 13:24:10 crc kubenswrapper[4921]: I0312 13:24:10.983526 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-p6h42" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.004212 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjt2p\" (UniqueName: \"kubernetes.io/projected/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-kube-api-access-rjt2p\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.004289 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-apiservice-cert\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 
13:24:11.004400 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-webhook-cert\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.014497 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k"] Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.105854 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjt2p\" (UniqueName: \"kubernetes.io/projected/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-kube-api-access-rjt2p\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.106141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-apiservice-cert\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.106250 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-webhook-cert\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.124150 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-webhook-cert\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.126307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-apiservice-cert\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.141219 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjt2p\" (UniqueName: \"kubernetes.io/projected/ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234-kube-api-access-rjt2p\") pod \"metallb-operator-controller-manager-74b4d54bf-8p27k\" (UID: \"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234\") " pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.251479 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h"] Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.252830 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.256446 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.256541 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.256455 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-r5x2j" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.271754 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h"] Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.295293 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.309756 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-apiservice-cert\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.309800 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-webhook-cert\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.309852 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gp6g\" (UniqueName: \"kubernetes.io/projected/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-kube-api-access-9gp6g\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.411720 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gp6g\" (UniqueName: \"kubernetes.io/projected/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-kube-api-access-9gp6g\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.411794 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-apiservice-cert\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.411837 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-webhook-cert\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.417048 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-webhook-cert\") pod 
\"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.422052 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-apiservice-cert\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.435094 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gp6g\" (UniqueName: \"kubernetes.io/projected/7a20ce4c-4e95-4fcd-ba22-212cc219c81f-kube-api-access-9gp6g\") pod \"metallb-operator-webhook-server-78c99c5f4b-pq84h\" (UID: \"7a20ce4c-4e95-4fcd-ba22-212cc219c81f\") " pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.571230 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.745469 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k"] Mar 12 13:24:11 crc kubenswrapper[4921]: W0312 13:24:11.763969 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccbab5b1_d08b_4c2d_9ac9_f265e0bf8234.slice/crio-6cb8bb4a927b95a3d25a336cb015b99f9be2b06d2e44515a8b696b7ae92d743c WatchSource:0}: Error finding container 6cb8bb4a927b95a3d25a336cb015b99f9be2b06d2e44515a8b696b7ae92d743c: Status 404 returned error can't find the container with id 6cb8bb4a927b95a3d25a336cb015b99f9be2b06d2e44515a8b696b7ae92d743c Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.818447 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h"] Mar 12 13:24:11 crc kubenswrapper[4921]: W0312 13:24:11.828272 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a20ce4c_4e95_4fcd_ba22_212cc219c81f.slice/crio-9916ddd189d66c024e78b29ef290c0d4639c49cdacbfc254c828d702d5674b00 WatchSource:0}: Error finding container 9916ddd189d66c024e78b29ef290c0d4639c49cdacbfc254c828d702d5674b00: Status 404 returned error can't find the container with id 9916ddd189d66c024e78b29ef290c0d4639c49cdacbfc254c828d702d5674b00 Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.875415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" event={"ID":"7a20ce4c-4e95-4fcd-ba22-212cc219c81f","Type":"ContainerStarted","Data":"9916ddd189d66c024e78b29ef290c0d4639c49cdacbfc254c828d702d5674b00"} Mar 12 13:24:11 crc kubenswrapper[4921]: I0312 13:24:11.877018 4921 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" event={"ID":"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234","Type":"ContainerStarted","Data":"6cb8bb4a927b95a3d25a336cb015b99f9be2b06d2e44515a8b696b7ae92d743c"} Mar 12 13:24:16 crc kubenswrapper[4921]: I0312 13:24:16.913604 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" event={"ID":"ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234","Type":"ContainerStarted","Data":"ab20abdefba5a0e259f6862a5e86393836f6d97f89ce00d33f77ee7d18c5de26"} Mar 12 13:24:16 crc kubenswrapper[4921]: I0312 13:24:16.914277 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:16 crc kubenswrapper[4921]: I0312 13:24:16.915246 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" event={"ID":"7a20ce4c-4e95-4fcd-ba22-212cc219c81f","Type":"ContainerStarted","Data":"d0e60c90fedf098725ae195cf523dcff5ed53ad7f5611037e0363b32d7afaad0"} Mar 12 13:24:16 crc kubenswrapper[4921]: I0312 13:24:16.915350 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:16 crc kubenswrapper[4921]: I0312 13:24:16.942206 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" podStartSLOduration=2.373524478 podStartE2EDuration="6.942185668s" podCreationTimestamp="2026-03-12 13:24:10 +0000 UTC" firstStartedPulling="2026-03-12 13:24:11.76570188 +0000 UTC m=+874.455773851" lastFinishedPulling="2026-03-12 13:24:16.33436307 +0000 UTC m=+879.024435041" observedRunningTime="2026-03-12 13:24:16.938774431 +0000 UTC m=+879.628846422" watchObservedRunningTime="2026-03-12 13:24:16.942185668 +0000 UTC m=+879.632257649" Mar 12 13:24:16 crc 
kubenswrapper[4921]: I0312 13:24:16.964306 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" podStartSLOduration=1.430779814 podStartE2EDuration="5.964285301s" podCreationTimestamp="2026-03-12 13:24:11 +0000 UTC" firstStartedPulling="2026-03-12 13:24:11.830880125 +0000 UTC m=+874.520952096" lastFinishedPulling="2026-03-12 13:24:16.364385612 +0000 UTC m=+879.054457583" observedRunningTime="2026-03-12 13:24:16.961018558 +0000 UTC m=+879.651090529" watchObservedRunningTime="2026-03-12 13:24:16.964285301 +0000 UTC m=+879.654357272" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.413132 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2vsd8"] Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.414157 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.435186 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vsd8"] Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.495684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9lrg\" (UniqueName: \"kubernetes.io/projected/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-kube-api-access-s9lrg\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.495737 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-catalog-content\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" 
Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.495907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-utilities\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.597236 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-catalog-content\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.597310 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9lrg\" (UniqueName: \"kubernetes.io/projected/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-kube-api-access-s9lrg\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.597454 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-utilities\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.597963 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-catalog-content\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc 
kubenswrapper[4921]: I0312 13:24:17.598104 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-utilities\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.625909 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9lrg\" (UniqueName: \"kubernetes.io/projected/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-kube-api-access-s9lrg\") pod \"redhat-marketplace-2vsd8\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.746687 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:17 crc kubenswrapper[4921]: I0312 13:24:17.955749 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vsd8"] Mar 12 13:24:18 crc kubenswrapper[4921]: I0312 13:24:18.932790 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerID="e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987" exitCode=0 Mar 12 13:24:18 crc kubenswrapper[4921]: I0312 13:24:18.932870 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerDied","Data":"e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987"} Mar 12 13:24:18 crc kubenswrapper[4921]: I0312 13:24:18.932921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" 
event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerStarted","Data":"3cc80cc4f3ed54f9beb80f624f41a46843f139c3eb8cce082744b87742680c71"} Mar 12 13:24:19 crc kubenswrapper[4921]: I0312 13:24:19.941920 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerStarted","Data":"619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb"} Mar 12 13:24:20 crc kubenswrapper[4921]: I0312 13:24:20.953849 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerID="619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb" exitCode=0 Mar 12 13:24:20 crc kubenswrapper[4921]: I0312 13:24:20.953974 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerDied","Data":"619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb"} Mar 12 13:24:21 crc kubenswrapper[4921]: I0312 13:24:21.965232 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerStarted","Data":"dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f"} Mar 12 13:24:22 crc kubenswrapper[4921]: I0312 13:24:22.000115 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2vsd8" podStartSLOduration=2.541797939 podStartE2EDuration="5.000102906s" podCreationTimestamp="2026-03-12 13:24:17 +0000 UTC" firstStartedPulling="2026-03-12 13:24:18.93489808 +0000 UTC m=+881.624970051" lastFinishedPulling="2026-03-12 13:24:21.393203037 +0000 UTC m=+884.083275018" observedRunningTime="2026-03-12 13:24:21.992833648 +0000 UTC m=+884.682905619" watchObservedRunningTime="2026-03-12 13:24:22.000102906 +0000 UTC 
m=+884.690174877" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.604829 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rkcwt"] Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.606339 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.614736 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkcwt"] Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.619432 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2v95\" (UniqueName: \"kubernetes.io/projected/0f790d06-1133-4e49-9f72-bc1ab3b8613c-kube-api-access-p2v95\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.619515 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-catalog-content\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.619693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-utilities\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.720838 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-utilities\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.720907 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2v95\" (UniqueName: \"kubernetes.io/projected/0f790d06-1133-4e49-9f72-bc1ab3b8613c-kube-api-access-p2v95\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.720944 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-catalog-content\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.721378 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-catalog-content\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.721599 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-utilities\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.750050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2v95\" (UniqueName: 
\"kubernetes.io/projected/0f790d06-1133-4e49-9f72-bc1ab3b8613c-kube-api-access-p2v95\") pod \"certified-operators-rkcwt\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:26 crc kubenswrapper[4921]: I0312 13:24:26.920321 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:27 crc kubenswrapper[4921]: I0312 13:24:27.146971 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rkcwt"] Mar 12 13:24:27 crc kubenswrapper[4921]: I0312 13:24:27.747470 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:27 crc kubenswrapper[4921]: I0312 13:24:27.747742 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:27 crc kubenswrapper[4921]: I0312 13:24:27.787729 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:28 crc kubenswrapper[4921]: I0312 13:24:28.004468 4921 generic.go:334] "Generic (PLEG): container finished" podID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerID="be0a5bccad9dad0bd3329c019f93523b49f19defca9906658e09f1880ba360c6" exitCode=0 Mar 12 13:24:28 crc kubenswrapper[4921]: I0312 13:24:28.004614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerDied","Data":"be0a5bccad9dad0bd3329c019f93523b49f19defca9906658e09f1880ba360c6"} Mar 12 13:24:28 crc kubenswrapper[4921]: I0312 13:24:28.004699 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" 
event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerStarted","Data":"f35beb8c5093b70772e526130984b1d9995c4d5b99ba52b0f7ad0cadd46b8624"} Mar 12 13:24:28 crc kubenswrapper[4921]: I0312 13:24:28.052609 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:29 crc kubenswrapper[4921]: I0312 13:24:29.011636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerStarted","Data":"fd7bbc2f426d03cbdc505f3f03faab58d11b02de4283803368238fc41ef726c6"} Mar 12 13:24:30 crc kubenswrapper[4921]: I0312 13:24:30.018434 4921 generic.go:334] "Generic (PLEG): container finished" podID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerID="fd7bbc2f426d03cbdc505f3f03faab58d11b02de4283803368238fc41ef726c6" exitCode=0 Mar 12 13:24:30 crc kubenswrapper[4921]: I0312 13:24:30.018486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerDied","Data":"fd7bbc2f426d03cbdc505f3f03faab58d11b02de4283803368238fc41ef726c6"} Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.027044 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerStarted","Data":"cb61ce42bf8907e89925645c04846b33e1838cdf89ed4caf2cc87edeed1b47f9"} Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.040767 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rkcwt" podStartSLOduration=2.604156678 podStartE2EDuration="5.040753044s" podCreationTimestamp="2026-03-12 13:24:26 +0000 UTC" firstStartedPulling="2026-03-12 13:24:28.006425287 +0000 UTC m=+890.696497268" lastFinishedPulling="2026-03-12 
13:24:30.443021663 +0000 UTC m=+893.133093634" observedRunningTime="2026-03-12 13:24:31.039781974 +0000 UTC m=+893.729853955" watchObservedRunningTime="2026-03-12 13:24:31.040753044 +0000 UTC m=+893.730825015" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.387732 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vsd8"] Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.389243 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2vsd8" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="registry-server" containerID="cri-o://dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f" gracePeriod=2 Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.577465 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78c99c5f4b-pq84h" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.774342 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.878292 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-utilities\") pod \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.878408 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9lrg\" (UniqueName: \"kubernetes.io/projected/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-kube-api-access-s9lrg\") pod \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.878445 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-catalog-content\") pod \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\" (UID: \"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c\") " Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.879250 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-utilities" (OuterVolumeSpecName: "utilities") pod "c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" (UID: "c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.888817 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-kube-api-access-s9lrg" (OuterVolumeSpecName: "kube-api-access-s9lrg") pod "c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" (UID: "c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c"). InnerVolumeSpecName "kube-api-access-s9lrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.907434 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" (UID: "c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.980501 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.980862 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9lrg\" (UniqueName: \"kubernetes.io/projected/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-kube-api-access-s9lrg\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:31 crc kubenswrapper[4921]: I0312 13:24:31.981018 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.039069 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerID="dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f" exitCode=0 Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.039162 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerDied","Data":"dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f"} Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.039214 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2vsd8" event={"ID":"c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c","Type":"ContainerDied","Data":"3cc80cc4f3ed54f9beb80f624f41a46843f139c3eb8cce082744b87742680c71"} Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.039233 4921 scope.go:117] "RemoveContainer" containerID="dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.039668 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vsd8" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.058411 4921 scope.go:117] "RemoveContainer" containerID="619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.059273 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vsd8"] Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.063601 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vsd8"] Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.074284 4921 scope.go:117] "RemoveContainer" containerID="e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.098862 4921 scope.go:117] "RemoveContainer" containerID="dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f" Mar 12 13:24:32 crc kubenswrapper[4921]: E0312 13:24:32.099267 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f\": container with ID starting with dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f not found: ID does not exist" containerID="dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.099311 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f"} err="failed to get container status \"dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f\": rpc error: code = NotFound desc = could not find container \"dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f\": container with ID starting with dab9a156fe2483f3801114333aba262d8f32402f674343c13df4ab1e12796f9f not found: ID does not exist" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.099350 4921 scope.go:117] "RemoveContainer" containerID="619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb" Mar 12 13:24:32 crc kubenswrapper[4921]: E0312 13:24:32.099662 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb\": container with ID starting with 619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb not found: ID does not exist" containerID="619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.099678 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb"} err="failed to get container status \"619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb\": rpc error: code = NotFound desc = could not find container \"619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb\": container with ID starting with 619585f5ffa2aae8ca78e3eaac2200df6367c007c7b069901385ab2d7255b4fb not found: ID does not exist" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.099689 4921 scope.go:117] "RemoveContainer" containerID="e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987" Mar 12 13:24:32 crc kubenswrapper[4921]: E0312 
13:24:32.099938 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987\": container with ID starting with e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987 not found: ID does not exist" containerID="e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987" Mar 12 13:24:32 crc kubenswrapper[4921]: I0312 13:24:32.099962 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987"} err="failed to get container status \"e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987\": rpc error: code = NotFound desc = could not find container \"e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987\": container with ID starting with e553d835d7c6662cddfe45e1b473b1031a0f6c0926a2dfcb92ba342d0218e987 not found: ID does not exist" Mar 12 13:24:33 crc kubenswrapper[4921]: I0312 13:24:33.991511 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" path="/var/lib/kubelet/pods/c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c/volumes" Mar 12 13:24:36 crc kubenswrapper[4921]: I0312 13:24:36.921317 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:36 crc kubenswrapper[4921]: I0312 13:24:36.922296 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:36 crc kubenswrapper[4921]: I0312 13:24:36.984787 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:37 crc kubenswrapper[4921]: I0312 13:24:37.143076 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:38 crc kubenswrapper[4921]: I0312 13:24:38.706725 4921 scope.go:117] "RemoveContainer" containerID="de2ba02afd825c8f08eddc6e570e30f2421d22424ee4e2ba6b29e1c1910aa587" Mar 12 13:24:39 crc kubenswrapper[4921]: I0312 13:24:39.392330 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rkcwt"] Mar 12 13:24:39 crc kubenswrapper[4921]: I0312 13:24:39.392659 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rkcwt" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="registry-server" containerID="cri-o://cb61ce42bf8907e89925645c04846b33e1838cdf89ed4caf2cc87edeed1b47f9" gracePeriod=2 Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.092611 4921 generic.go:334] "Generic (PLEG): container finished" podID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerID="cb61ce42bf8907e89925645c04846b33e1838cdf89ed4caf2cc87edeed1b47f9" exitCode=0 Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.092666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerDied","Data":"cb61ce42bf8907e89925645c04846b33e1838cdf89ed4caf2cc87edeed1b47f9"} Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.418050 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.524231 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-utilities\") pod \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.524283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2v95\" (UniqueName: \"kubernetes.io/projected/0f790d06-1133-4e49-9f72-bc1ab3b8613c-kube-api-access-p2v95\") pod \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.524364 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-catalog-content\") pod \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\" (UID: \"0f790d06-1133-4e49-9f72-bc1ab3b8613c\") " Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.525222 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-utilities" (OuterVolumeSpecName: "utilities") pod "0f790d06-1133-4e49-9f72-bc1ab3b8613c" (UID: "0f790d06-1133-4e49-9f72-bc1ab3b8613c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.532548 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f790d06-1133-4e49-9f72-bc1ab3b8613c-kube-api-access-p2v95" (OuterVolumeSpecName: "kube-api-access-p2v95") pod "0f790d06-1133-4e49-9f72-bc1ab3b8613c" (UID: "0f790d06-1133-4e49-9f72-bc1ab3b8613c"). InnerVolumeSpecName "kube-api-access-p2v95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.588612 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f790d06-1133-4e49-9f72-bc1ab3b8613c" (UID: "0f790d06-1133-4e49-9f72-bc1ab3b8613c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.625537 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.625582 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f790d06-1133-4e49-9f72-bc1ab3b8613c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:40 crc kubenswrapper[4921]: I0312 13:24:40.625595 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2v95\" (UniqueName: \"kubernetes.io/projected/0f790d06-1133-4e49-9f72-bc1ab3b8613c-kube-api-access-p2v95\") on node \"crc\" DevicePath \"\"" Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.100971 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rkcwt" event={"ID":"0f790d06-1133-4e49-9f72-bc1ab3b8613c","Type":"ContainerDied","Data":"f35beb8c5093b70772e526130984b1d9995c4d5b99ba52b0f7ad0cadd46b8624"} Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.101039 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rkcwt" Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.101275 4921 scope.go:117] "RemoveContainer" containerID="cb61ce42bf8907e89925645c04846b33e1838cdf89ed4caf2cc87edeed1b47f9" Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.116714 4921 scope.go:117] "RemoveContainer" containerID="fd7bbc2f426d03cbdc505f3f03faab58d11b02de4283803368238fc41ef726c6" Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.132692 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rkcwt"] Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.135425 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rkcwt"] Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.151581 4921 scope.go:117] "RemoveContainer" containerID="be0a5bccad9dad0bd3329c019f93523b49f19defca9906658e09f1880ba360c6" Mar 12 13:24:41 crc kubenswrapper[4921]: I0312 13:24:41.995990 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" path="/var/lib/kubelet/pods/0f790d06-1133-4e49-9f72-bc1ab3b8613c/volumes" Mar 12 13:24:51 crc kubenswrapper[4921]: I0312 13:24:51.298756 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74b4d54bf-8p27k" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062047 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-qcglj"] Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.062322 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="registry-server" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062341 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="registry-server" Mar 12 13:24:52 crc 
kubenswrapper[4921]: E0312 13:24:52.062356 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="extract-utilities" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062365 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="extract-utilities" Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.062377 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="extract-content" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062385 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="extract-content" Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.062404 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="registry-server" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062412 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="registry-server" Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.062428 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="extract-utilities" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062436 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="extract-utilities" Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.062635 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="extract-content" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062644 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="extract-content" Mar 12 13:24:52 crc 
kubenswrapper[4921]: I0312 13:24:52.062770 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a59a5b-bc3c-4c70-9f5a-abdd51a7827c" containerName="registry-server" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.062787 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f790d06-1133-4e49-9f72-bc1ab3b8613c" containerName="registry-server" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.065152 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: W0312 13:24:52.069764 4921 reflector.go:561] object-"metallb-system"/"frr-k8s-daemon-dockercfg-th7sc": failed to list *v1.Secret: secrets "frr-k8s-daemon-dockercfg-th7sc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.069840 4921 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"frr-k8s-daemon-dockercfg-th7sc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"frr-k8s-daemon-dockercfg-th7sc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 13:24:52 crc kubenswrapper[4921]: W0312 13:24:52.069897 4921 reflector.go:561] object-"metallb-system"/"frr-k8s-certs-secret": failed to list *v1.Secret: secrets "frr-k8s-certs-secret" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.069912 4921 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"frr-k8s-certs-secret\": Failed to watch *v1.Secret: failed to list 
*v1.Secret: secrets \"frr-k8s-certs-secret\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 13:24:52 crc kubenswrapper[4921]: W0312 13:24:52.070085 4921 reflector.go:561] object-"metallb-system"/"frr-startup": failed to list *v1.ConfigMap: configmaps "frr-startup" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.070115 4921 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"frr-startup\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"frr-startup\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.074486 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5"] Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.075302 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.077832 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.096208 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5"] Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.157190 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zfh6j"] Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.158144 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.161183 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.161536 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.161714 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-q452j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.162265 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.176279 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-nzvhg"] Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.177135 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.179580 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.186649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-sockets\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.186685 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aabfc338-f7a1-46a8-a02a-daf1adc64862-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jn8d5\" (UID: \"aabfc338-f7a1-46a8-a02a-daf1adc64862\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.186711 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics-certs\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.186896 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-reloader\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.186960 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-startup\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.187005 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-conf\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.187063 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd4h5\" (UniqueName: \"kubernetes.io/projected/aabfc338-f7a1-46a8-a02a-daf1adc64862-kube-api-access-dd4h5\") pod \"frr-k8s-webhook-server-bcc4b6f68-jn8d5\" (UID: \"aabfc338-f7a1-46a8-a02a-daf1adc64862\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.187137 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86ns\" (UniqueName: \"kubernetes.io/projected/2ebf7941-9d40-49cf-ad40-530b5e696770-kube-api-access-z86ns\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.187281 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.194680 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-nzvhg"] Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.288903 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.288953 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.288973 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ceb498e3-36d0-4f72-9c07-54807b7a11ea-cert\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.288993 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8ae92198-0eeb-414f-859a-27c54e4338bf-metallb-excludel2\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-metrics-certs\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-sockets\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289054 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ceb498e3-36d0-4f72-9c07-54807b7a11ea-metrics-certs\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289214 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aabfc338-f7a1-46a8-a02a-daf1adc64862-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jn8d5\" (UID: \"aabfc338-f7a1-46a8-a02a-daf1adc64862\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289278 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics-certs\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289335 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8cpt\" (UniqueName: \"kubernetes.io/projected/ceb498e3-36d0-4f72-9c07-54807b7a11ea-kube-api-access-v8cpt\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289396 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c22xc\" (UniqueName: 
\"kubernetes.io/projected/8ae92198-0eeb-414f-859a-27c54e4338bf-kube-api-access-c22xc\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289481 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-reloader\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-startup\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289551 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-conf\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289594 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd4h5\" (UniqueName: \"kubernetes.io/projected/aabfc338-f7a1-46a8-a02a-daf1adc64862-kube-api-access-dd4h5\") pod \"frr-k8s-webhook-server-bcc4b6f68-jn8d5\" (UID: \"aabfc338-f7a1-46a8-a02a-daf1adc64862\") " 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289621 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-sockets\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289669 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z86ns\" (UniqueName: \"kubernetes.io/projected/2ebf7941-9d40-49cf-ad40-530b5e696770-kube-api-access-z86ns\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.289836 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-conf\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.290026 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/2ebf7941-9d40-49cf-ad40-530b5e696770-reloader\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.296299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aabfc338-f7a1-46a8-a02a-daf1adc64862-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jn8d5\" (UID: \"aabfc338-f7a1-46a8-a02a-daf1adc64862\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.310518 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z86ns\" (UniqueName: \"kubernetes.io/projected/2ebf7941-9d40-49cf-ad40-530b5e696770-kube-api-access-z86ns\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.313856 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd4h5\" (UniqueName: \"kubernetes.io/projected/aabfc338-f7a1-46a8-a02a-daf1adc64862-kube-api-access-dd4h5\") pod \"frr-k8s-webhook-server-bcc4b6f68-jn8d5\" (UID: \"aabfc338-f7a1-46a8-a02a-daf1adc64862\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.391664 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8cpt\" (UniqueName: \"kubernetes.io/projected/ceb498e3-36d0-4f72-9c07-54807b7a11ea-kube-api-access-v8cpt\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.392305 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c22xc\" (UniqueName: \"kubernetes.io/projected/8ae92198-0eeb-414f-859a-27c54e4338bf-kube-api-access-c22xc\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.392392 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.392421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ceb498e3-36d0-4f72-9c07-54807b7a11ea-cert\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.392443 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8ae92198-0eeb-414f-859a-27c54e4338bf-metallb-excludel2\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.392466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-metrics-certs\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.392490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ceb498e3-36d0-4f72-9c07-54807b7a11ea-metrics-certs\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.393075 4921 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.393147 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist podName:8ae92198-0eeb-414f-859a-27c54e4338bf nodeName:}" failed. No retries permitted until 2026-03-12 13:24:52.893127405 +0000 UTC m=+915.583199376 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist") pod "speaker-zfh6j" (UID: "8ae92198-0eeb-414f-859a-27c54e4338bf") : secret "metallb-memberlist" not found Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.394164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/8ae92198-0eeb-414f-859a-27c54e4338bf-metallb-excludel2\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.396384 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.396494 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-metrics-certs\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.399333 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ceb498e3-36d0-4f72-9c07-54807b7a11ea-metrics-certs\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.407888 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ceb498e3-36d0-4f72-9c07-54807b7a11ea-cert\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.411329 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-v8cpt\" (UniqueName: \"kubernetes.io/projected/ceb498e3-36d0-4f72-9c07-54807b7a11ea-kube-api-access-v8cpt\") pod \"controller-7bb4cc7c98-nzvhg\" (UID: \"ceb498e3-36d0-4f72-9c07-54807b7a11ea\") " pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.431234 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c22xc\" (UniqueName: \"kubernetes.io/projected/8ae92198-0eeb-414f-859a-27c54e4338bf-kube-api-access-c22xc\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.488442 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.700227 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-nzvhg"] Mar 12 13:24:52 crc kubenswrapper[4921]: I0312 13:24:52.898678 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.898969 4921 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 13:24:52 crc kubenswrapper[4921]: E0312 13:24:52.899198 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist podName:8ae92198-0eeb-414f-859a-27c54e4338bf nodeName:}" failed. No retries permitted until 2026-03-12 13:24:53.899176189 +0000 UTC m=+916.589248160 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist") pod "speaker-zfh6j" (UID: "8ae92198-0eeb-414f-859a-27c54e4338bf") : secret "metallb-memberlist" not found Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.175402 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-nzvhg" event={"ID":"ceb498e3-36d0-4f72-9c07-54807b7a11ea","Type":"ContainerStarted","Data":"a5b13779d7ae58d3c241721586bc07c35fcca7a01989e2d553722a2ce851d942"} Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.175444 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-nzvhg" event={"ID":"ceb498e3-36d0-4f72-9c07-54807b7a11ea","Type":"ContainerStarted","Data":"100124b3b85aa0bc6886cdd214f37184eca773bc7f99a4055510f43080ca7664"} Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.175454 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-nzvhg" event={"ID":"ceb498e3-36d0-4f72-9c07-54807b7a11ea","Type":"ContainerStarted","Data":"b0e0d6c3c6344cce7f0084a4f6f115734959ce9fb4d5d13f0c9e4b6338f70578"} Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.175664 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:24:53 crc kubenswrapper[4921]: E0312 13:24:53.290419 4921 configmap.go:193] Couldn't get configMap metallb-system/frr-startup: failed to sync configmap cache: timed out waiting for the condition Mar 12 13:24:53 crc kubenswrapper[4921]: E0312 13:24:53.290480 4921 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: failed to sync secret cache: timed out waiting for the condition Mar 12 13:24:53 crc kubenswrapper[4921]: E0312 13:24:53.290503 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-startup 
podName:2ebf7941-9d40-49cf-ad40-530b5e696770 nodeName:}" failed. No retries permitted until 2026-03-12 13:24:53.790484915 +0000 UTC m=+916.480556886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "frr-startup" (UniqueName: "kubernetes.io/configmap/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-startup") pod "frr-k8s-qcglj" (UID: "2ebf7941-9d40-49cf-ad40-530b5e696770") : failed to sync configmap cache: timed out waiting for the condition Mar 12 13:24:53 crc kubenswrapper[4921]: E0312 13:24:53.290551 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics-certs podName:2ebf7941-9d40-49cf-ad40-530b5e696770 nodeName:}" failed. No retries permitted until 2026-03-12 13:24:53.790531556 +0000 UTC m=+916.480603517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics-certs") pod "frr-k8s-qcglj" (UID: "2ebf7941-9d40-49cf-ad40-530b5e696770") : failed to sync secret cache: timed out waiting for the condition Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.310026 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.393110 4921 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.393199 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.446267 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.493050 4921 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-th7sc" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.587350 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-nzvhg" podStartSLOduration=1.5873343370000002 podStartE2EDuration="1.587334337s" podCreationTimestamp="2026-03-12 13:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:24:53.207168231 +0000 UTC m=+915.897240202" watchObservedRunningTime="2026-03-12 13:24:53.587334337 +0000 UTC m=+916.277406308" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.590066 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5"] Mar 12 13:24:53 crc kubenswrapper[4921]: W0312 13:24:53.596920 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaabfc338_f7a1_46a8_a02a_daf1adc64862.slice/crio-0f7919ff98df60d3f8ae8fd924a1e312452da6c2a23dcc232330898fe7c99a54 WatchSource:0}: Error finding container 0f7919ff98df60d3f8ae8fd924a1e312452da6c2a23dcc232330898fe7c99a54: Status 404 returned error can't find the container with id 0f7919ff98df60d3f8ae8fd924a1e312452da6c2a23dcc232330898fe7c99a54 Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.817086 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics-certs\") pod 
\"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.817246 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-startup\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.818345 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/2ebf7941-9d40-49cf-ad40-530b5e696770-frr-startup\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.823188 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ebf7941-9d40-49cf-ad40-530b5e696770-metrics-certs\") pod \"frr-k8s-qcglj\" (UID: \"2ebf7941-9d40-49cf-ad40-530b5e696770\") " pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.883529 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-qcglj" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.919551 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.923250 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/8ae92198-0eeb-414f-859a-27c54e4338bf-memberlist\") pod \"speaker-zfh6j\" (UID: \"8ae92198-0eeb-414f-859a-27c54e4338bf\") " pod="metallb-system/speaker-zfh6j" Mar 12 13:24:53 crc kubenswrapper[4921]: I0312 13:24:53.974799 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zfh6j" Mar 12 13:24:54 crc kubenswrapper[4921]: W0312 13:24:54.002505 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae92198_0eeb_414f_859a_27c54e4338bf.slice/crio-47efda9c985933e48bf6c88a8961c31a00fbeaa58ce10269f23c9565c6710bc5 WatchSource:0}: Error finding container 47efda9c985933e48bf6c88a8961c31a00fbeaa58ce10269f23c9565c6710bc5: Status 404 returned error can't find the container with id 47efda9c985933e48bf6c88a8961c31a00fbeaa58ce10269f23c9565c6710bc5 Mar 12 13:24:54 crc kubenswrapper[4921]: I0312 13:24:54.182163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"058c9af0aec15540c67d6de5fb8b0b5ea7823a610a5b046068a57e7be7909e9c"} Mar 12 13:24:54 crc kubenswrapper[4921]: I0312 13:24:54.183647 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" 
event={"ID":"aabfc338-f7a1-46a8-a02a-daf1adc64862","Type":"ContainerStarted","Data":"0f7919ff98df60d3f8ae8fd924a1e312452da6c2a23dcc232330898fe7c99a54"} Mar 12 13:24:54 crc kubenswrapper[4921]: I0312 13:24:54.185963 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfh6j" event={"ID":"8ae92198-0eeb-414f-859a-27c54e4338bf","Type":"ContainerStarted","Data":"47efda9c985933e48bf6c88a8961c31a00fbeaa58ce10269f23c9565c6710bc5"} Mar 12 13:24:55 crc kubenswrapper[4921]: I0312 13:24:55.194304 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfh6j" event={"ID":"8ae92198-0eeb-414f-859a-27c54e4338bf","Type":"ContainerStarted","Data":"49e9fb4c652f30876ee9bd29222512f022f792d21671aadc719b9e891f125834"} Mar 12 13:24:55 crc kubenswrapper[4921]: I0312 13:24:55.194354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zfh6j" event={"ID":"8ae92198-0eeb-414f-859a-27c54e4338bf","Type":"ContainerStarted","Data":"6110198f05ce59cee46a32716104b8c579bd5341b60a4d2fb1ac7683603608e1"} Mar 12 13:24:55 crc kubenswrapper[4921]: I0312 13:24:55.194464 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zfh6j" Mar 12 13:24:55 crc kubenswrapper[4921]: I0312 13:24:55.221537 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zfh6j" podStartSLOduration=3.221506962 podStartE2EDuration="3.221506962s" podCreationTimestamp="2026-03-12 13:24:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:24:55.214615335 +0000 UTC m=+917.904687316" watchObservedRunningTime="2026-03-12 13:24:55.221506962 +0000 UTC m=+917.911578943" Mar 12 13:25:01 crc kubenswrapper[4921]: I0312 13:25:01.234034 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ebf7941-9d40-49cf-ad40-530b5e696770" 
containerID="2107cf2cc30a82eb87fc6966c64b444f7a69762e0461aa4a66ad06fefdcc7933" exitCode=0 Mar 12 13:25:01 crc kubenswrapper[4921]: I0312 13:25:01.234151 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerDied","Data":"2107cf2cc30a82eb87fc6966c64b444f7a69762e0461aa4a66ad06fefdcc7933"} Mar 12 13:25:01 crc kubenswrapper[4921]: I0312 13:25:01.237434 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" event={"ID":"aabfc338-f7a1-46a8-a02a-daf1adc64862","Type":"ContainerStarted","Data":"6e90777ecc0800e85bdcff98e2a4d609f7448a8bac9e0a4152b7c886809bb114"} Mar 12 13:25:01 crc kubenswrapper[4921]: I0312 13:25:01.237712 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:25:01 crc kubenswrapper[4921]: I0312 13:25:01.301452 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" podStartSLOduration=2.479447113 podStartE2EDuration="9.301431081s" podCreationTimestamp="2026-03-12 13:24:52 +0000 UTC" firstStartedPulling="2026-03-12 13:24:53.599326363 +0000 UTC m=+916.289398334" lastFinishedPulling="2026-03-12 13:25:00.421310321 +0000 UTC m=+923.111382302" observedRunningTime="2026-03-12 13:25:01.297173307 +0000 UTC m=+923.987245288" watchObservedRunningTime="2026-03-12 13:25:01.301431081 +0000 UTC m=+923.991503062" Mar 12 13:25:02 crc kubenswrapper[4921]: I0312 13:25:02.246035 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ebf7941-9d40-49cf-ad40-530b5e696770" containerID="a7f4745589b6abd458dbf24573bc3a4e2983c4250f74f6a1978df6ff0b60acd9" exitCode=0 Mar 12 13:25:02 crc kubenswrapper[4921]: I0312 13:25:02.246102 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" 
event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerDied","Data":"a7f4745589b6abd458dbf24573bc3a4e2983c4250f74f6a1978df6ff0b60acd9"} Mar 12 13:25:02 crc kubenswrapper[4921]: I0312 13:25:02.495085 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-nzvhg" Mar 12 13:25:03 crc kubenswrapper[4921]: I0312 13:25:03.252751 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ebf7941-9d40-49cf-ad40-530b5e696770" containerID="3e92656f243cc89c08dd4419daad1300ffa0ff67b96a71d85770d0c57ab62b6b" exitCode=0 Mar 12 13:25:03 crc kubenswrapper[4921]: I0312 13:25:03.252990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerDied","Data":"3e92656f243cc89c08dd4419daad1300ffa0ff67b96a71d85770d0c57ab62b6b"} Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.264458 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"21eec6eb3c3bd082bfed63896b4ea26cdebb1a0c3de00e36ab350c6bf924619e"} Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.264784 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"62294862879c1e8a35fb7b1f7c676d2de5e392474cf786089af104999367cd72"} Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.264802 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-qcglj" Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.264814 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"97a5b434260c8ea0c9012322cdd360ea162ac4c630714e6e2fa8437ca2c05019"} Mar 12 13:25:04 crc 
kubenswrapper[4921]: I0312 13:25:04.264863 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"2dcc32349942e74083e02ee7f211db7718e8a723adf889d46853951cf0c06512"} Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.264874 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"b76114f7d274db1eb61c5d677f957076bb78b099bf157ee20cf2b1fd1d80e36a"} Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.264882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-qcglj" event={"ID":"2ebf7941-9d40-49cf-ad40-530b5e696770","Type":"ContainerStarted","Data":"7c3af0c1329e68255e4c20ddce27627e4d90eeb4926369cdf1df8d0758476107"} Mar 12 13:25:04 crc kubenswrapper[4921]: I0312 13:25:04.282964 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-qcglj" podStartSLOduration=5.851138764 podStartE2EDuration="12.282948982s" podCreationTimestamp="2026-03-12 13:24:52 +0000 UTC" firstStartedPulling="2026-03-12 13:24:54.006171666 +0000 UTC m=+916.696243647" lastFinishedPulling="2026-03-12 13:25:00.437981884 +0000 UTC m=+923.128053865" observedRunningTime="2026-03-12 13:25:04.280459394 +0000 UTC m=+926.970531355" watchObservedRunningTime="2026-03-12 13:25:04.282948982 +0000 UTC m=+926.973020953" Mar 12 13:25:08 crc kubenswrapper[4921]: I0312 13:25:08.884012 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-qcglj" Mar 12 13:25:08 crc kubenswrapper[4921]: I0312 13:25:08.944467 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-qcglj" Mar 12 13:25:13 crc kubenswrapper[4921]: I0312 13:25:13.399991 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" Mar 12 13:25:13 crc kubenswrapper[4921]: I0312 13:25:13.886930 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-qcglj" Mar 12 13:25:13 crc kubenswrapper[4921]: I0312 13:25:13.978919 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zfh6j" Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.841613 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hh472"] Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.842927 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.846400 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8mndb" Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.847104 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.847175 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.862188 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hh472"] Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.879772 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfqjr\" (UniqueName: \"kubernetes.io/projected/019b83cc-a1bd-4145-9a27-77bafa886614-kube-api-access-dfqjr\") pod \"openstack-operator-index-hh472\" (UID: \"019b83cc-a1bd-4145-9a27-77bafa886614\") " pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:16 crc kubenswrapper[4921]: I0312 13:25:16.980570 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfqjr\" (UniqueName: \"kubernetes.io/projected/019b83cc-a1bd-4145-9a27-77bafa886614-kube-api-access-dfqjr\") pod \"openstack-operator-index-hh472\" (UID: \"019b83cc-a1bd-4145-9a27-77bafa886614\") " pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:17 crc kubenswrapper[4921]: I0312 13:25:17.001080 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfqjr\" (UniqueName: \"kubernetes.io/projected/019b83cc-a1bd-4145-9a27-77bafa886614-kube-api-access-dfqjr\") pod \"openstack-operator-index-hh472\" (UID: \"019b83cc-a1bd-4145-9a27-77bafa886614\") " pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:17 crc kubenswrapper[4921]: I0312 13:25:17.181078 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:17 crc kubenswrapper[4921]: I0312 13:25:17.611903 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hh472"] Mar 12 13:25:17 crc kubenswrapper[4921]: W0312 13:25:17.629979 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod019b83cc_a1bd_4145_9a27_77bafa886614.slice/crio-e7b325c0c558700d69a4e859a8dfb53dd887ac993ce9a3ab7ad81734436b1133 WatchSource:0}: Error finding container e7b325c0c558700d69a4e859a8dfb53dd887ac993ce9a3ab7ad81734436b1133: Status 404 returned error can't find the container with id e7b325c0c558700d69a4e859a8dfb53dd887ac993ce9a3ab7ad81734436b1133 Mar 12 13:25:18 crc kubenswrapper[4921]: I0312 13:25:18.356997 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hh472" 
event={"ID":"019b83cc-a1bd-4145-9a27-77bafa886614","Type":"ContainerStarted","Data":"e7b325c0c558700d69a4e859a8dfb53dd887ac993ce9a3ab7ad81734436b1133"} Mar 12 13:25:19 crc kubenswrapper[4921]: I0312 13:25:19.813294 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hh472"] Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.372406 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hh472" event={"ID":"019b83cc-a1bd-4145-9a27-77bafa886614","Type":"ContainerStarted","Data":"4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067"} Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.372901 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hh472" podUID="019b83cc-a1bd-4145-9a27-77bafa886614" containerName="registry-server" containerID="cri-o://4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067" gracePeriod=2 Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.395212 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hh472" podStartSLOduration=1.9248495939999999 podStartE2EDuration="4.395187579s" podCreationTimestamp="2026-03-12 13:25:16 +0000 UTC" firstStartedPulling="2026-03-12 13:25:17.631964676 +0000 UTC m=+940.322036667" lastFinishedPulling="2026-03-12 13:25:20.102302681 +0000 UTC m=+942.792374652" observedRunningTime="2026-03-12 13:25:20.392219306 +0000 UTC m=+943.082291307" watchObservedRunningTime="2026-03-12 13:25:20.395187579 +0000 UTC m=+943.085259570" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.423434 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rrhpc"] Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.424270 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.427567 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrhpc"] Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.563584 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zjbc\" (UniqueName: \"kubernetes.io/projected/5f20d433-83bd-4524-a6ce-ef19ef8a1064-kube-api-access-6zjbc\") pod \"openstack-operator-index-rrhpc\" (UID: \"5f20d433-83bd-4524-a6ce-ef19ef8a1064\") " pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.665341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zjbc\" (UniqueName: \"kubernetes.io/projected/5f20d433-83bd-4524-a6ce-ef19ef8a1064-kube-api-access-6zjbc\") pod \"openstack-operator-index-rrhpc\" (UID: \"5f20d433-83bd-4524-a6ce-ef19ef8a1064\") " pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.685620 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zjbc\" (UniqueName: \"kubernetes.io/projected/5f20d433-83bd-4524-a6ce-ef19ef8a1064-kube-api-access-6zjbc\") pod \"openstack-operator-index-rrhpc\" (UID: \"5f20d433-83bd-4524-a6ce-ef19ef8a1064\") " pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.728641 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.820449 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.867239 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfqjr\" (UniqueName: \"kubernetes.io/projected/019b83cc-a1bd-4145-9a27-77bafa886614-kube-api-access-dfqjr\") pod \"019b83cc-a1bd-4145-9a27-77bafa886614\" (UID: \"019b83cc-a1bd-4145-9a27-77bafa886614\") " Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.870523 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019b83cc-a1bd-4145-9a27-77bafa886614-kube-api-access-dfqjr" (OuterVolumeSpecName: "kube-api-access-dfqjr") pod "019b83cc-a1bd-4145-9a27-77bafa886614" (UID: "019b83cc-a1bd-4145-9a27-77bafa886614"). InnerVolumeSpecName "kube-api-access-dfqjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:25:20 crc kubenswrapper[4921]: I0312 13:25:20.969032 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfqjr\" (UniqueName: \"kubernetes.io/projected/019b83cc-a1bd-4145-9a27-77bafa886614-kube-api-access-dfqjr\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.243169 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rrhpc"] Mar 12 13:25:21 crc kubenswrapper[4921]: W0312 13:25:21.248115 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f20d433_83bd_4524_a6ce_ef19ef8a1064.slice/crio-75edb4ba15c688db1c70586c36c220f8e26d01da7faf6a575c0a732dd567f1d6 WatchSource:0}: Error finding container 75edb4ba15c688db1c70586c36c220f8e26d01da7faf6a575c0a732dd567f1d6: Status 404 returned error can't find the container with id 75edb4ba15c688db1c70586c36c220f8e26d01da7faf6a575c0a732dd567f1d6 Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.380782 4921 generic.go:334] 
"Generic (PLEG): container finished" podID="019b83cc-a1bd-4145-9a27-77bafa886614" containerID="4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067" exitCode=0 Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.380855 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hh472" event={"ID":"019b83cc-a1bd-4145-9a27-77bafa886614","Type":"ContainerDied","Data":"4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067"} Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.381147 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hh472" event={"ID":"019b83cc-a1bd-4145-9a27-77bafa886614","Type":"ContainerDied","Data":"e7b325c0c558700d69a4e859a8dfb53dd887ac993ce9a3ab7ad81734436b1133"} Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.381170 4921 scope.go:117] "RemoveContainer" containerID="4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067" Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.380918 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hh472" Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.382486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrhpc" event={"ID":"5f20d433-83bd-4524-a6ce-ef19ef8a1064","Type":"ContainerStarted","Data":"75edb4ba15c688db1c70586c36c220f8e26d01da7faf6a575c0a732dd567f1d6"} Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.407327 4921 scope.go:117] "RemoveContainer" containerID="4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067" Mar 12 13:25:21 crc kubenswrapper[4921]: E0312 13:25:21.407897 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067\": container with ID starting with 4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067 not found: ID does not exist" containerID="4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067" Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.407933 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067"} err="failed to get container status \"4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067\": rpc error: code = NotFound desc = could not find container \"4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067\": container with ID starting with 4dc986e58bab5157eed95c55f290b0f56be999aee399b88805d438db05731067 not found: ID does not exist" Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.415178 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hh472"] Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.419297 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hh472"] 
Mar 12 13:25:21 crc kubenswrapper[4921]: I0312 13:25:21.994421 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019b83cc-a1bd-4145-9a27-77bafa886614" path="/var/lib/kubelet/pods/019b83cc-a1bd-4145-9a27-77bafa886614/volumes" Mar 12 13:25:22 crc kubenswrapper[4921]: I0312 13:25:22.394553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rrhpc" event={"ID":"5f20d433-83bd-4524-a6ce-ef19ef8a1064","Type":"ContainerStarted","Data":"815be658db90c53b3edfbc468c911326a0910a2b1cc9592c9982317f8bf8c1f7"} Mar 12 13:25:22 crc kubenswrapper[4921]: I0312 13:25:22.413541 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rrhpc" podStartSLOduration=2.358313742 podStartE2EDuration="2.413517475s" podCreationTimestamp="2026-03-12 13:25:20 +0000 UTC" firstStartedPulling="2026-03-12 13:25:21.255443365 +0000 UTC m=+943.945515356" lastFinishedPulling="2026-03-12 13:25:21.310647118 +0000 UTC m=+944.000719089" observedRunningTime="2026-03-12 13:25:22.411373757 +0000 UTC m=+945.101445768" watchObservedRunningTime="2026-03-12 13:25:22.413517475 +0000 UTC m=+945.103589486" Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.837838 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qnbrh"] Mar 12 13:25:24 crc kubenswrapper[4921]: E0312 13:25:24.839178 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019b83cc-a1bd-4145-9a27-77bafa886614" containerName="registry-server" Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.839223 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="019b83cc-a1bd-4145-9a27-77bafa886614" containerName="registry-server" Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.839582 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="019b83cc-a1bd-4145-9a27-77bafa886614" containerName="registry-server" Mar 12 13:25:24 
crc kubenswrapper[4921]: I0312 13:25:24.841638 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.862510 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qnbrh"] Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.941643 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpnf8\" (UniqueName: \"kubernetes.io/projected/02007124-6057-4c76-9a8a-7da45fbb5450-kube-api-access-lpnf8\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.941737 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-catalog-content\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:24 crc kubenswrapper[4921]: I0312 13:25:24.941806 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-utilities\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.042928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-catalog-content\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 
12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.043005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-utilities\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.043078 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpnf8\" (UniqueName: \"kubernetes.io/projected/02007124-6057-4c76-9a8a-7da45fbb5450-kube-api-access-lpnf8\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.044047 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-catalog-content\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.044392 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-utilities\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.080708 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpnf8\" (UniqueName: \"kubernetes.io/projected/02007124-6057-4c76-9a8a-7da45fbb5450-kube-api-access-lpnf8\") pod \"community-operators-qnbrh\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: 
I0312 13:25:25.177580 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:25 crc kubenswrapper[4921]: I0312 13:25:25.450231 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qnbrh"] Mar 12 13:25:26 crc kubenswrapper[4921]: I0312 13:25:26.435242 4921 generic.go:334] "Generic (PLEG): container finished" podID="02007124-6057-4c76-9a8a-7da45fbb5450" containerID="be8e1aff0328d2e3a4f335b0dfb700efc023d0b314dcfd68c279a226b20d7cad" exitCode=0 Mar 12 13:25:26 crc kubenswrapper[4921]: I0312 13:25:26.435351 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerDied","Data":"be8e1aff0328d2e3a4f335b0dfb700efc023d0b314dcfd68c279a226b20d7cad"} Mar 12 13:25:26 crc kubenswrapper[4921]: I0312 13:25:26.435590 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerStarted","Data":"47ac7c0f32426ab0550c922bd9ec8cd53e6a331f3d9c137a7062ce9689a8ae30"} Mar 12 13:25:27 crc kubenswrapper[4921]: I0312 13:25:27.444663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerStarted","Data":"0c1d09c13d1538f8de372704d0177bf0c6b360e8fec4e5e7a60e93e9ca2923b4"} Mar 12 13:25:28 crc kubenswrapper[4921]: I0312 13:25:28.456436 4921 generic.go:334] "Generic (PLEG): container finished" podID="02007124-6057-4c76-9a8a-7da45fbb5450" containerID="0c1d09c13d1538f8de372704d0177bf0c6b360e8fec4e5e7a60e93e9ca2923b4" exitCode=0 Mar 12 13:25:28 crc kubenswrapper[4921]: I0312 13:25:28.456497 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" 
event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerDied","Data":"0c1d09c13d1538f8de372704d0177bf0c6b360e8fec4e5e7a60e93e9ca2923b4"} Mar 12 13:25:29 crc kubenswrapper[4921]: I0312 13:25:29.467539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerStarted","Data":"03ceaeb590891f2263ea251ed3386a30cd832d3c1faed756de5e3fc7776b5b93"} Mar 12 13:25:29 crc kubenswrapper[4921]: I0312 13:25:29.496245 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qnbrh" podStartSLOduration=3.063975022 podStartE2EDuration="5.496226233s" podCreationTimestamp="2026-03-12 13:25:24 +0000 UTC" firstStartedPulling="2026-03-12 13:25:26.437385965 +0000 UTC m=+949.127457936" lastFinishedPulling="2026-03-12 13:25:28.869637146 +0000 UTC m=+951.559709147" observedRunningTime="2026-03-12 13:25:29.492430094 +0000 UTC m=+952.182502125" watchObservedRunningTime="2026-03-12 13:25:29.496226233 +0000 UTC m=+952.186298214" Mar 12 13:25:30 crc kubenswrapper[4921]: I0312 13:25:30.821105 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:30 crc kubenswrapper[4921]: I0312 13:25:30.821763 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:30 crc kubenswrapper[4921]: I0312 13:25:30.846120 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:31 crc kubenswrapper[4921]: I0312 13:25:31.506114 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-rrhpc" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.260619 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t"] Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.262651 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.266938 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wtj85" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.280911 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t"] Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.352582 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-bundle\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.352624 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-util\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.352666 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mwh\" (UniqueName: \"kubernetes.io/projected/ee35f8dc-1fbf-4466-86c0-17d859d09951-kube-api-access-28mwh\") pod 
\"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.454426 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mwh\" (UniqueName: \"kubernetes.io/projected/ee35f8dc-1fbf-4466-86c0-17d859d09951-kube-api-access-28mwh\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.454616 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-bundle\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.454662 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-util\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.455281 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-bundle\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " 
pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.455495 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-util\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.476587 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mwh\" (UniqueName: \"kubernetes.io/projected/ee35f8dc-1fbf-4466-86c0-17d859d09951-kube-api-access-28mwh\") pod \"4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.592430 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:32 crc kubenswrapper[4921]: I0312 13:25:32.874082 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t"] Mar 12 13:25:32 crc kubenswrapper[4921]: W0312 13:25:32.877630 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee35f8dc_1fbf_4466_86c0_17d859d09951.slice/crio-24f311cbd1d957bac1a87e6e465f95b37f46fb6507d37892b73d45375b58dac5 WatchSource:0}: Error finding container 24f311cbd1d957bac1a87e6e465f95b37f46fb6507d37892b73d45375b58dac5: Status 404 returned error can't find the container with id 24f311cbd1d957bac1a87e6e465f95b37f46fb6507d37892b73d45375b58dac5 Mar 12 13:25:33 crc kubenswrapper[4921]: I0312 13:25:33.502449 4921 generic.go:334] "Generic (PLEG): container finished" podID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerID="1ecfe1113f66391668c1c0abde6fea491f7fa2c6842eff7d628880d42e962219" exitCode=0 Mar 12 13:25:33 crc kubenswrapper[4921]: I0312 13:25:33.502551 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" event={"ID":"ee35f8dc-1fbf-4466-86c0-17d859d09951","Type":"ContainerDied","Data":"1ecfe1113f66391668c1c0abde6fea491f7fa2c6842eff7d628880d42e962219"} Mar 12 13:25:33 crc kubenswrapper[4921]: I0312 13:25:33.502903 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" event={"ID":"ee35f8dc-1fbf-4466-86c0-17d859d09951","Type":"ContainerStarted","Data":"24f311cbd1d957bac1a87e6e465f95b37f46fb6507d37892b73d45375b58dac5"} Mar 12 13:25:34 crc kubenswrapper[4921]: I0312 13:25:34.510281 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerID="d70b855a0c8e98ed993cae2f79fe9704f7f43dbd66bef0655557a14ba71a6b99" exitCode=0 Mar 12 13:25:34 crc kubenswrapper[4921]: I0312 13:25:34.510333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" event={"ID":"ee35f8dc-1fbf-4466-86c0-17d859d09951","Type":"ContainerDied","Data":"d70b855a0c8e98ed993cae2f79fe9704f7f43dbd66bef0655557a14ba71a6b99"} Mar 12 13:25:35 crc kubenswrapper[4921]: I0312 13:25:35.178740 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:35 crc kubenswrapper[4921]: I0312 13:25:35.178886 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:35 crc kubenswrapper[4921]: I0312 13:25:35.230280 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:35 crc kubenswrapper[4921]: I0312 13:25:35.523327 4921 generic.go:334] "Generic (PLEG): container finished" podID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerID="36753a680a55a4ed5205e0e7d6279a30c303c056ae08c133211d67fd46262725" exitCode=0 Mar 12 13:25:35 crc kubenswrapper[4921]: I0312 13:25:35.523429 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" event={"ID":"ee35f8dc-1fbf-4466-86c0-17d859d09951","Type":"ContainerDied","Data":"36753a680a55a4ed5205e0e7d6279a30c303c056ae08c133211d67fd46262725"} Mar 12 13:25:35 crc kubenswrapper[4921]: I0312 13:25:35.595757 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.852452 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.923129 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-util\") pod \"ee35f8dc-1fbf-4466-86c0-17d859d09951\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.923187 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mwh\" (UniqueName: \"kubernetes.io/projected/ee35f8dc-1fbf-4466-86c0-17d859d09951-kube-api-access-28mwh\") pod \"ee35f8dc-1fbf-4466-86c0-17d859d09951\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.923309 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-bundle\") pod \"ee35f8dc-1fbf-4466-86c0-17d859d09951\" (UID: \"ee35f8dc-1fbf-4466-86c0-17d859d09951\") " Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.923954 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-bundle" (OuterVolumeSpecName: "bundle") pod "ee35f8dc-1fbf-4466-86c0-17d859d09951" (UID: "ee35f8dc-1fbf-4466-86c0-17d859d09951"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.933988 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee35f8dc-1fbf-4466-86c0-17d859d09951-kube-api-access-28mwh" (OuterVolumeSpecName: "kube-api-access-28mwh") pod "ee35f8dc-1fbf-4466-86c0-17d859d09951" (UID: "ee35f8dc-1fbf-4466-86c0-17d859d09951"). InnerVolumeSpecName "kube-api-access-28mwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:25:36 crc kubenswrapper[4921]: I0312 13:25:36.939621 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-util" (OuterVolumeSpecName: "util") pod "ee35f8dc-1fbf-4466-86c0-17d859d09951" (UID: "ee35f8dc-1fbf-4466-86c0-17d859d09951"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.025282 4921 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-util\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.025335 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mwh\" (UniqueName: \"kubernetes.io/projected/ee35f8dc-1fbf-4466-86c0-17d859d09951-kube-api-access-28mwh\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.025355 4921 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee35f8dc-1fbf-4466-86c0-17d859d09951-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.423334 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qnbrh"] Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.545417 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" event={"ID":"ee35f8dc-1fbf-4466-86c0-17d859d09951","Type":"ContainerDied","Data":"24f311cbd1d957bac1a87e6e465f95b37f46fb6507d37892b73d45375b58dac5"} Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.545485 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24f311cbd1d957bac1a87e6e465f95b37f46fb6507d37892b73d45375b58dac5" 
Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.545497 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t" Mar 12 13:25:37 crc kubenswrapper[4921]: I0312 13:25:37.545595 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qnbrh" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="registry-server" containerID="cri-o://03ceaeb590891f2263ea251ed3386a30cd832d3c1faed756de5e3fc7776b5b93" gracePeriod=2 Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.557331 4921 generic.go:334] "Generic (PLEG): container finished" podID="02007124-6057-4c76-9a8a-7da45fbb5450" containerID="03ceaeb590891f2263ea251ed3386a30cd832d3c1faed756de5e3fc7776b5b93" exitCode=0 Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.557450 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerDied","Data":"03ceaeb590891f2263ea251ed3386a30cd832d3c1faed756de5e3fc7776b5b93"} Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.557652 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qnbrh" event={"ID":"02007124-6057-4c76-9a8a-7da45fbb5450","Type":"ContainerDied","Data":"47ac7c0f32426ab0550c922bd9ec8cd53e6a331f3d9c137a7062ce9689a8ae30"} Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.557677 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47ac7c0f32426ab0550c922bd9ec8cd53e6a331f3d9c137a7062ce9689a8ae30" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.560578 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.646735 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-catalog-content\") pod \"02007124-6057-4c76-9a8a-7da45fbb5450\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.646795 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpnf8\" (UniqueName: \"kubernetes.io/projected/02007124-6057-4c76-9a8a-7da45fbb5450-kube-api-access-lpnf8\") pod \"02007124-6057-4c76-9a8a-7da45fbb5450\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.646883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-utilities\") pod \"02007124-6057-4c76-9a8a-7da45fbb5450\" (UID: \"02007124-6057-4c76-9a8a-7da45fbb5450\") " Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.647790 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-utilities" (OuterVolumeSpecName: "utilities") pod "02007124-6057-4c76-9a8a-7da45fbb5450" (UID: "02007124-6057-4c76-9a8a-7da45fbb5450"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.654444 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02007124-6057-4c76-9a8a-7da45fbb5450-kube-api-access-lpnf8" (OuterVolumeSpecName: "kube-api-access-lpnf8") pod "02007124-6057-4c76-9a8a-7da45fbb5450" (UID: "02007124-6057-4c76-9a8a-7da45fbb5450"). InnerVolumeSpecName "kube-api-access-lpnf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.715613 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02007124-6057-4c76-9a8a-7da45fbb5450" (UID: "02007124-6057-4c76-9a8a-7da45fbb5450"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.748309 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.748342 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpnf8\" (UniqueName: \"kubernetes.io/projected/02007124-6057-4c76-9a8a-7da45fbb5450-kube-api-access-lpnf8\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:38 crc kubenswrapper[4921]: I0312 13:25:38.748361 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02007124-6057-4c76-9a8a-7da45fbb5450-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:25:39 crc kubenswrapper[4921]: I0312 13:25:39.562521 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qnbrh" Mar 12 13:25:39 crc kubenswrapper[4921]: I0312 13:25:39.596043 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qnbrh"] Mar 12 13:25:39 crc kubenswrapper[4921]: I0312 13:25:39.601280 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qnbrh"] Mar 12 13:25:39 crc kubenswrapper[4921]: I0312 13:25:39.996878 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" path="/var/lib/kubelet/pods/02007124-6057-4c76-9a8a-7da45fbb5450/volumes" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421020 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq"] Mar 12 13:25:42 crc kubenswrapper[4921]: E0312 13:25:42.421717 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="registry-server" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421740 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="registry-server" Mar 12 13:25:42 crc kubenswrapper[4921]: E0312 13:25:42.421770 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="util" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421783 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="util" Mar 12 13:25:42 crc kubenswrapper[4921]: E0312 13:25:42.421797 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="extract-utilities" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421810 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="extract-utilities" Mar 12 13:25:42 crc kubenswrapper[4921]: E0312 13:25:42.421862 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="extract-content" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421876 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="extract-content" Mar 12 13:25:42 crc kubenswrapper[4921]: E0312 13:25:42.421893 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="extract" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421904 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="extract" Mar 12 13:25:42 crc kubenswrapper[4921]: E0312 13:25:42.421918 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="pull" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.421930 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="pull" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.422122 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="02007124-6057-4c76-9a8a-7da45fbb5450" containerName="registry-server" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.422147 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee35f8dc-1fbf-4466-86c0-17d859d09951" containerName="extract" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.422731 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.425471 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-hhjmj" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.467465 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq"] Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.503529 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27nqn\" (UniqueName: \"kubernetes.io/projected/c7db0c3c-40e2-49df-bffc-c0f94b26c92f-kube-api-access-27nqn\") pod \"openstack-operator-controller-init-5bc4df7446-bp8nq\" (UID: \"c7db0c3c-40e2-49df-bffc-c0f94b26c92f\") " pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.604928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27nqn\" (UniqueName: \"kubernetes.io/projected/c7db0c3c-40e2-49df-bffc-c0f94b26c92f-kube-api-access-27nqn\") pod \"openstack-operator-controller-init-5bc4df7446-bp8nq\" (UID: \"c7db0c3c-40e2-49df-bffc-c0f94b26c92f\") " pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.622594 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27nqn\" (UniqueName: \"kubernetes.io/projected/c7db0c3c-40e2-49df-bffc-c0f94b26c92f-kube-api-access-27nqn\") pod \"openstack-operator-controller-init-5bc4df7446-bp8nq\" (UID: \"c7db0c3c-40e2-49df-bffc-c0f94b26c92f\") " pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:42 crc kubenswrapper[4921]: I0312 13:25:42.745074 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:43 crc kubenswrapper[4921]: I0312 13:25:43.048863 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq"] Mar 12 13:25:43 crc kubenswrapper[4921]: I0312 13:25:43.592349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" event={"ID":"c7db0c3c-40e2-49df-bffc-c0f94b26c92f","Type":"ContainerStarted","Data":"5d34cc9ec8b9bd61d38622d4ffe6e6ed0d43a265dc176c86fc4a1aa08dcf2eb7"} Mar 12 13:25:47 crc kubenswrapper[4921]: I0312 13:25:47.637653 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" event={"ID":"c7db0c3c-40e2-49df-bffc-c0f94b26c92f","Type":"ContainerStarted","Data":"487610a51d09892acc962ab009e8739c79cf489ac24a9aca9be5e62f559fe45f"} Mar 12 13:25:47 crc kubenswrapper[4921]: I0312 13:25:47.638683 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:47 crc kubenswrapper[4921]: I0312 13:25:47.694194 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" podStartSLOduration=2.189832037 podStartE2EDuration="5.694160188s" podCreationTimestamp="2026-03-12 13:25:42 +0000 UTC" firstStartedPulling="2026-03-12 13:25:43.054872153 +0000 UTC m=+965.744944124" lastFinishedPulling="2026-03-12 13:25:46.559200304 +0000 UTC m=+969.249272275" observedRunningTime="2026-03-12 13:25:47.681716208 +0000 UTC m=+970.371788219" watchObservedRunningTime="2026-03-12 13:25:47.694160188 +0000 UTC m=+970.384232209" Mar 12 13:25:52 crc kubenswrapper[4921]: I0312 13:25:52.748627 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-5bc4df7446-bp8nq" Mar 12 13:25:56 crc kubenswrapper[4921]: I0312 13:25:56.323623 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:25:56 crc kubenswrapper[4921]: I0312 13:25:56.324037 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.135785 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555366-2kvrw"] Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.136496 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.138795 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.138830 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.138852 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.142053 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555366-2kvrw"] Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.255056 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqpv\" (UniqueName: \"kubernetes.io/projected/230fb418-c791-493a-9703-188ba4af8657-kube-api-access-8fqpv\") pod \"auto-csr-approver-29555366-2kvrw\" (UID: \"230fb418-c791-493a-9703-188ba4af8657\") " pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.356977 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqpv\" (UniqueName: \"kubernetes.io/projected/230fb418-c791-493a-9703-188ba4af8657-kube-api-access-8fqpv\") pod \"auto-csr-approver-29555366-2kvrw\" (UID: \"230fb418-c791-493a-9703-188ba4af8657\") " pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.393619 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqpv\" (UniqueName: \"kubernetes.io/projected/230fb418-c791-493a-9703-188ba4af8657-kube-api-access-8fqpv\") pod \"auto-csr-approver-29555366-2kvrw\" (UID: \"230fb418-c791-493a-9703-188ba4af8657\") " 
pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.451283 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.694208 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555366-2kvrw"] Mar 12 13:26:00 crc kubenswrapper[4921]: I0312 13:26:00.739261 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" event={"ID":"230fb418-c791-493a-9703-188ba4af8657","Type":"ContainerStarted","Data":"b3a2e1966e5bf05c9ad360e11725ccd5a47314af67408e9301bf89c775ea81e6"} Mar 12 13:26:02 crc kubenswrapper[4921]: I0312 13:26:02.752154 4921 generic.go:334] "Generic (PLEG): container finished" podID="230fb418-c791-493a-9703-188ba4af8657" containerID="6ec6847728a310a5ebe83d645ad8fba01a7971d5fcc48461074fa52038ee05a5" exitCode=0 Mar 12 13:26:02 crc kubenswrapper[4921]: I0312 13:26:02.752217 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" event={"ID":"230fb418-c791-493a-9703-188ba4af8657","Type":"ContainerDied","Data":"6ec6847728a310a5ebe83d645ad8fba01a7971d5fcc48461074fa52038ee05a5"} Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.010275 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.103233 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fqpv\" (UniqueName: \"kubernetes.io/projected/230fb418-c791-493a-9703-188ba4af8657-kube-api-access-8fqpv\") pod \"230fb418-c791-493a-9703-188ba4af8657\" (UID: \"230fb418-c791-493a-9703-188ba4af8657\") " Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.108657 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/230fb418-c791-493a-9703-188ba4af8657-kube-api-access-8fqpv" (OuterVolumeSpecName: "kube-api-access-8fqpv") pod "230fb418-c791-493a-9703-188ba4af8657" (UID: "230fb418-c791-493a-9703-188ba4af8657"). InnerVolumeSpecName "kube-api-access-8fqpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.204349 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fqpv\" (UniqueName: \"kubernetes.io/projected/230fb418-c791-493a-9703-188ba4af8657-kube-api-access-8fqpv\") on node \"crc\" DevicePath \"\"" Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.765390 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" event={"ID":"230fb418-c791-493a-9703-188ba4af8657","Type":"ContainerDied","Data":"b3a2e1966e5bf05c9ad360e11725ccd5a47314af67408e9301bf89c775ea81e6"} Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.765439 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a2e1966e5bf05c9ad360e11725ccd5a47314af67408e9301bf89c775ea81e6" Mar 12 13:26:04 crc kubenswrapper[4921]: I0312 13:26:04.765445 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555366-2kvrw" Mar 12 13:26:05 crc kubenswrapper[4921]: I0312 13:26:05.057549 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555360-t8xh2"] Mar 12 13:26:05 crc kubenswrapper[4921]: I0312 13:26:05.061294 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555360-t8xh2"] Mar 12 13:26:05 crc kubenswrapper[4921]: I0312 13:26:05.993263 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c6a2d3-4d37-4e4a-b16b-011befefbb0c" path="/var/lib/kubelet/pods/68c6a2d3-4d37-4e4a-b16b-011befefbb0c/volumes" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.786914 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv"] Mar 12 13:26:12 crc kubenswrapper[4921]: E0312 13:26:12.787979 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="230fb418-c791-493a-9703-188ba4af8657" containerName="oc" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.787999 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="230fb418-c791-493a-9703-188ba4af8657" containerName="oc" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.788200 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="230fb418-c791-493a-9703-188ba4af8657" containerName="oc" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.788729 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.791465 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mrd9b" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.792615 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.793461 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.798242 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wzx2n" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.811585 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.818973 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.819838 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.822507 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.829406 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9p5m5" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.845604 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.864336 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.865539 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.868062 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9nc7r" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.887883 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.901498 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.902293 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.906069 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-97qs7" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.917558 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcs49\" (UniqueName: \"kubernetes.io/projected/ac8d4a43-01b6-438e-b1d8-d3521ed82176-kube-api-access-dcs49\") pod \"cinder-operator-controller-manager-984cd4dcf-zmq56\" (UID: \"ac8d4a43-01b6-438e-b1d8-d3521ed82176\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.917625 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsmgm\" (UniqueName: \"kubernetes.io/projected/7494cb10-090c-4ac2-bbf1-663979f3e4cf-kube-api-access-rsmgm\") pod \"glance-operator-controller-manager-5964f64c48-5jt7c\" (UID: \"7494cb10-090c-4ac2-bbf1-663979f3e4cf\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.917686 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bltrr\" (UniqueName: \"kubernetes.io/projected/0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea-kube-api-access-bltrr\") pod \"barbican-operator-controller-manager-677bd678f7-dmwhv\" (UID: \"0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.917706 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf5sd\" (UniqueName: 
\"kubernetes.io/projected/5908e8b2-d088-4190-8ccf-ea7526921e80-kube-api-access-qf5sd\") pod \"designate-operator-controller-manager-66d56f6ff4-j46tf\" (UID: \"5908e8b2-d088-4190-8ccf-ea7526921e80\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.922192 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.923088 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.925157 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-h8b9r" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.941802 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.952268 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.953252 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.954997 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7dfmc" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.957113 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.966276 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.995178 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg"] Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.996074 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" Mar 12 13:26:12 crc kubenswrapper[4921]: I0312 13:26:12.999019 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lmmmj" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.005308 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bltrr\" (UniqueName: \"kubernetes.io/projected/0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea-kube-api-access-bltrr\") pod \"barbican-operator-controller-manager-677bd678f7-dmwhv\" (UID: \"0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:13 crc 
kubenswrapper[4921]: I0312 13:26:13.019540 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019565 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf5sd\" (UniqueName: \"kubernetes.io/projected/5908e8b2-d088-4190-8ccf-ea7526921e80-kube-api-access-qf5sd\") pod \"designate-operator-controller-manager-66d56f6ff4-j46tf\" (UID: \"5908e8b2-d088-4190-8ccf-ea7526921e80\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019594 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pr8k\" (UniqueName: \"kubernetes.io/projected/c09491c8-72c5-4019-91bf-37ee1a3a937c-kube-api-access-4pr8k\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019629 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qpz\" (UniqueName: \"kubernetes.io/projected/c6de3785-ea06-49bb-9b39-d8f2f10bce81-kube-api-access-24qpz\") pod \"heat-operator-controller-manager-77b6666d85-nq8wj\" (UID: \"c6de3785-ea06-49bb-9b39-d8f2f10bce81\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019653 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dcs49\" (UniqueName: \"kubernetes.io/projected/ac8d4a43-01b6-438e-b1d8-d3521ed82176-kube-api-access-dcs49\") pod \"cinder-operator-controller-manager-984cd4dcf-zmq56\" (UID: \"ac8d4a43-01b6-438e-b1d8-d3521ed82176\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019685 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsmgm\" (UniqueName: \"kubernetes.io/projected/7494cb10-090c-4ac2-bbf1-663979f3e4cf-kube-api-access-rsmgm\") pod \"glance-operator-controller-manager-5964f64c48-5jt7c\" (UID: \"7494cb10-090c-4ac2-bbf1-663979f3e4cf\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.019711 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv68n\" (UniqueName: \"kubernetes.io/projected/001425f5-0a2a-4bdc-a437-d6f9ba3687b4-kube-api-access-cv68n\") pod \"horizon-operator-controller-manager-6d9d6b584d-fp4rs\" (UID: \"001425f5-0a2a-4bdc-a437-d6f9ba3687b4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.044875 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.063623 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.064581 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.068070 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcs49\" (UniqueName: \"kubernetes.io/projected/ac8d4a43-01b6-438e-b1d8-d3521ed82176-kube-api-access-dcs49\") pod \"cinder-operator-controller-manager-984cd4dcf-zmq56\" (UID: \"ac8d4a43-01b6-438e-b1d8-d3521ed82176\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.071167 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fj2nc" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.073277 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf5sd\" (UniqueName: \"kubernetes.io/projected/5908e8b2-d088-4190-8ccf-ea7526921e80-kube-api-access-qf5sd\") pod \"designate-operator-controller-manager-66d56f6ff4-j46tf\" (UID: \"5908e8b2-d088-4190-8ccf-ea7526921e80\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.081496 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bltrr\" (UniqueName: \"kubernetes.io/projected/0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea-kube-api-access-bltrr\") pod \"barbican-operator-controller-manager-677bd678f7-dmwhv\" (UID: \"0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.086008 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsmgm\" (UniqueName: \"kubernetes.io/projected/7494cb10-090c-4ac2-bbf1-663979f3e4cf-kube-api-access-rsmgm\") pod \"glance-operator-controller-manager-5964f64c48-5jt7c\" 
(UID: \"7494cb10-090c-4ac2-bbf1-663979f3e4cf\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.106126 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.106881 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.114703 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-5mc52" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.121497 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfmrg\" (UniqueName: \"kubernetes.io/projected/6a1a1aea-a74a-4886-ae24-1d188243e859-kube-api-access-wfmrg\") pod \"ironic-operator-controller-manager-6bbb499bbc-67xqg\" (UID: \"6a1a1aea-a74a-4886-ae24-1d188243e859\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.121566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.121603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnts\" (UniqueName: \"kubernetes.io/projected/d4de9b0c-3812-462a-aa80-ffe00e6d47ca-kube-api-access-qjnts\") pod \"keystone-operator-controller-manager-684f77d66d-v42m2\" (UID: 
\"d4de9b0c-3812-462a-aa80-ffe00e6d47ca\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.121638 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pr8k\" (UniqueName: \"kubernetes.io/projected/c09491c8-72c5-4019-91bf-37ee1a3a937c-kube-api-access-4pr8k\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.121691 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qpz\" (UniqueName: \"kubernetes.io/projected/c6de3785-ea06-49bb-9b39-d8f2f10bce81-kube-api-access-24qpz\") pod \"heat-operator-controller-manager-77b6666d85-nq8wj\" (UID: \"c6de3785-ea06-49bb-9b39-d8f2f10bce81\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.121758 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv68n\" (UniqueName: \"kubernetes.io/projected/001425f5-0a2a-4bdc-a437-d6f9ba3687b4-kube-api-access-cv68n\") pod \"horizon-operator-controller-manager-6d9d6b584d-fp4rs\" (UID: \"001425f5-0a2a-4bdc-a437-d6f9ba3687b4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.122044 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.122153 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert podName:c09491c8-72c5-4019-91bf-37ee1a3a937c nodeName:}" failed. 
No retries permitted until 2026-03-12 13:26:13.622135574 +0000 UTC m=+996.312207545 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert") pod "infra-operator-controller-manager-5995f4446f-9tkrv" (UID: "c09491c8-72c5-4019-91bf-37ee1a3a937c") : secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.122458 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.123136 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.147179 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.157197 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.157687 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.161383 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv68n\" (UniqueName: \"kubernetes.io/projected/001425f5-0a2a-4bdc-a437-d6f9ba3687b4-kube-api-access-cv68n\") pod \"horizon-operator-controller-manager-6d9d6b584d-fp4rs\" (UID: \"001425f5-0a2a-4bdc-a437-d6f9ba3687b4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.162899 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pr8k\" (UniqueName: \"kubernetes.io/projected/c09491c8-72c5-4019-91bf-37ee1a3a937c-kube-api-access-4pr8k\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.165460 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.168359 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.170983 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2dpl4"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.177040 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qpz\" (UniqueName: \"kubernetes.io/projected/c6de3785-ea06-49bb-9b39-d8f2f10bce81-kube-api-access-24qpz\") pod \"heat-operator-controller-manager-77b6666d85-nq8wj\" (UID: \"c6de3785-ea06-49bb-9b39-d8f2f10bce81\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.180024 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.193381 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.194315 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.195315 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.197368 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-zf8wg"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.199705 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.202070 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.209523 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-bj8mw"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.223526 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf2s6\" (UniqueName: \"kubernetes.io/projected/fd1bc9ca-529d-4d59-a236-db1bb5c121ca-kube-api-access-zf2s6\") pod \"manila-operator-controller-manager-68f45f9d9f-xzm8h\" (UID: \"fd1bc9ca-529d-4d59-a236-db1bb5c121ca\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.223599 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfmrg\" (UniqueName: \"kubernetes.io/projected/6a1a1aea-a74a-4886-ae24-1d188243e859-kube-api-access-wfmrg\") pod \"ironic-operator-controller-manager-6bbb499bbc-67xqg\" (UID: \"6a1a1aea-a74a-4886-ae24-1d188243e859\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.223663 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnts\" (UniqueName: \"kubernetes.io/projected/d4de9b0c-3812-462a-aa80-ffe00e6d47ca-kube-api-access-qjnts\") pod \"keystone-operator-controller-manager-684f77d66d-v42m2\" (UID: \"d4de9b0c-3812-462a-aa80-ffe00e6d47ca\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.223690 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wxr\" (UniqueName: \"kubernetes.io/projected/6131e4c9-d85a-4cdf-9cec-128c9e81bc29-kube-api-access-k9wxr\") pod \"mariadb-operator-controller-manager-658d4cdd5-692s5\" (UID: \"6131e4c9-d85a-4cdf-9cec-128c9e81bc29\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.224907 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.240020 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.242217 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.244228 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnts\" (UniqueName: \"kubernetes.io/projected/d4de9b0c-3812-462a-aa80-ffe00e6d47ca-kube-api-access-qjnts\") pod \"keystone-operator-controller-manager-684f77d66d-v42m2\" (UID: \"d4de9b0c-3812-462a-aa80-ffe00e6d47ca\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.246067 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.246576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfmrg\" (UniqueName: \"kubernetes.io/projected/6a1a1aea-a74a-4886-ae24-1d188243e859-kube-api-access-wfmrg\") pod \"ironic-operator-controller-manager-6bbb499bbc-67xqg\" (UID: \"6a1a1aea-a74a-4886-ae24-1d188243e859\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.253111 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.255005 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.257686 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6qknp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.258636 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.263555 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.264613 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.267058 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dwtg9"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.267133 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.275677 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.276592 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.281770 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hth6h"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.286344 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.299303 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.304750 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.305477 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.315690 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.315820 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m842c"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.316432 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.316924 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.321308 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m842c"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325487 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8fxx\" (UniqueName: \"kubernetes.io/projected/2394f3bd-4f8b-4036-b240-7ed71b80798a-kube-api-access-p8fxx\") pod \"neutron-operator-controller-manager-776c5696bf-kzh67\" (UID: \"2394f3bd-4f8b-4036-b240-7ed71b80798a\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325535 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9xks\" (UniqueName: \"kubernetes.io/projected/4e1ee178-3f0e-405a-93cb-9414b2fccbe0-kube-api-access-j9xks\") pod \"octavia-operator-controller-manager-5f4f55cb5c-bz8j7\" (UID: \"4e1ee178-3f0e-405a-93cb-9414b2fccbe0\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325558 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcwx\" (UniqueName: \"kubernetes.io/projected/1a0b0ff9-21c3-452f-9ded-00d374fbbcbe-kube-api-access-qbcwx\") pod \"nova-operator-controller-manager-686d5f9fbd-hmkmx\" (UID: \"1a0b0ff9-21c3-452f-9ded-00d374fbbcbe\") " pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325609 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf2s6\" (UniqueName: \"kubernetes.io/projected/fd1bc9ca-529d-4d59-a236-db1bb5c121ca-kube-api-access-zf2s6\") pod \"manila-operator-controller-manager-68f45f9d9f-xzm8h\" (UID: \"fd1bc9ca-529d-4d59-a236-db1bb5c121ca\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325636 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqcft\" (UniqueName: \"kubernetes.io/projected/0c9cd39f-8440-4f22-82ce-d3be95bea1be-kube-api-access-xqcft\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325688 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325714 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wxr\" (UniqueName: \"kubernetes.io/projected/6131e4c9-d85a-4cdf-9cec-128c9e81bc29-kube-api-access-k9wxr\") pod \"mariadb-operator-controller-manager-658d4cdd5-692s5\" (UID: \"6131e4c9-d85a-4cdf-9cec-128c9e81bc29\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.325760 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnml\" (UniqueName: \"kubernetes.io/projected/994c3a47-47a7-4fbe-9f9c-df011597775b-kube-api-access-dsnml\") pod \"ovn-operator-controller-manager-bbc5b68f9-x4tf4\" (UID: \"994c3a47-47a7-4fbe-9f9c-df011597775b\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.326685 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.327525 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.331827 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-scqtq"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.332037 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-45d8p"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.332514 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-5qjq6"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.341247 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.355733 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf2s6\" (UniqueName: \"kubernetes.io/projected/fd1bc9ca-529d-4d59-a236-db1bb5c121ca-kube-api-access-zf2s6\") pod \"manila-operator-controller-manager-68f45f9d9f-xzm8h\" (UID: \"fd1bc9ca-529d-4d59-a236-db1bb5c121ca\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.357865 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9wxr\" (UniqueName: \"kubernetes.io/projected/6131e4c9-d85a-4cdf-9cec-128c9e81bc29-kube-api-access-k9wxr\") pod \"mariadb-operator-controller-manager-658d4cdd5-692s5\" (UID: \"6131e4c9-d85a-4cdf-9cec-128c9e81bc29\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.362830 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.367357 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.377907 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9h97q"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.401271 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.427945 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqcft\" (UniqueName: \"kubernetes.io/projected/0c9cd39f-8440-4f22-82ce-d3be95bea1be-kube-api-access-xqcft\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.427996 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnml\" (UniqueName: \"kubernetes.io/projected/994c3a47-47a7-4fbe-9f9c-df011597775b-kube-api-access-dsnml\") pod \"ovn-operator-controller-manager-bbc5b68f9-x4tf4\" (UID: \"994c3a47-47a7-4fbe-9f9c-df011597775b\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428059 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtk5m\" (UniqueName: \"kubernetes.io/projected/3a930c0b-6c3b-4a1d-b02f-1190a124ceb2-kube-api-access-rtk5m\") pod \"placement-operator-controller-manager-574d45c66c-64dcj\" (UID: \"3a930c0b-6c3b-4a1d-b02f-1190a124ceb2\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428096 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvnnw\" (UniqueName: \"kubernetes.io/projected/fe35cc9d-bfc6-4a4d-b21f-06ab55672726-kube-api-access-mvnnw\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-dlgkj\" (UID: \"fe35cc9d-bfc6-4a4d-b21f-06ab55672726\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428134 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8fxx\" (UniqueName: \"kubernetes.io/projected/2394f3bd-4f8b-4036-b240-7ed71b80798a-kube-api-access-p8fxx\") pod \"neutron-operator-controller-manager-776c5696bf-kzh67\" (UID: \"2394f3bd-4f8b-4036-b240-7ed71b80798a\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428152 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9xks\" (UniqueName: \"kubernetes.io/projected/4e1ee178-3f0e-405a-93cb-9414b2fccbe0-kube-api-access-j9xks\") pod \"octavia-operator-controller-manager-5f4f55cb5c-bz8j7\" (UID: \"4e1ee178-3f0e-405a-93cb-9414b2fccbe0\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428172 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcwx\" (UniqueName: \"kubernetes.io/projected/1a0b0ff9-21c3-452f-9ded-00d374fbbcbe-kube-api-access-qbcwx\") pod \"nova-operator-controller-manager-686d5f9fbd-hmkmx\" (UID: \"1a0b0ff9-21c3-452f-9ded-00d374fbbcbe\") " pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428197 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w2p\" (UniqueName: \"kubernetes.io/projected/ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b-kube-api-access-f4w2p\") pod \"test-operator-controller-manager-5c5cb9c4d7-2sf7v\" (UID: \"ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.428217 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bnwp\" (UniqueName: \"kubernetes.io/projected/f2c81917-4047-4d0b-baed-45afa8a53a60-kube-api-access-6bnwp\") pod \"swift-operator-controller-manager-677c674df7-m842c\" (UID: \"f2c81917-4047-4d0b-baed-45afa8a53a60\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c"
Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.428230 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.428306 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert podName:0c9cd39f-8440-4f22-82ce-d3be95bea1be nodeName:}" failed. No retries permitted until 2026-03-12 13:26:13.928279627 +0000 UTC m=+996.618351598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" (UID: "0c9cd39f-8440-4f22-82ce-d3be95bea1be") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.453553 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.454609 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.456284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcwx\" (UniqueName: \"kubernetes.io/projected/1a0b0ff9-21c3-452f-9ded-00d374fbbcbe-kube-api-access-qbcwx\") pod \"nova-operator-controller-manager-686d5f9fbd-hmkmx\" (UID: \"1a0b0ff9-21c3-452f-9ded-00d374fbbcbe\") " pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.457222 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.457472 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqcft\" (UniqueName: \"kubernetes.io/projected/0c9cd39f-8440-4f22-82ce-d3be95bea1be-kube-api-access-xqcft\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.458591 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rtzcc"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.459568 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnml\" (UniqueName: \"kubernetes.io/projected/994c3a47-47a7-4fbe-9f9c-df011597775b-kube-api-access-dsnml\") pod \"ovn-operator-controller-manager-bbc5b68f9-x4tf4\" (UID: \"994c3a47-47a7-4fbe-9f9c-df011597775b\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.461368 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8fxx\" (UniqueName: \"kubernetes.io/projected/2394f3bd-4f8b-4036-b240-7ed71b80798a-kube-api-access-p8fxx\") pod \"neutron-operator-controller-manager-776c5696bf-kzh67\" (UID: \"2394f3bd-4f8b-4036-b240-7ed71b80798a\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.471434 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9xks\" (UniqueName: \"kubernetes.io/projected/4e1ee178-3f0e-405a-93cb-9414b2fccbe0-kube-api-access-j9xks\") pod \"octavia-operator-controller-manager-5f4f55cb5c-bz8j7\" (UID: \"4e1ee178-3f0e-405a-93cb-9414b2fccbe0\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.499211 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.500091 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.503010 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.503176 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.503314 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8cqnd"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.516294 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.528564 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.528885 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w2p\" (UniqueName: \"kubernetes.io/projected/ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b-kube-api-access-f4w2p\") pod \"test-operator-controller-manager-5c5cb9c4d7-2sf7v\" (UID: \"ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.528929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bnwp\" (UniqueName: \"kubernetes.io/projected/f2c81917-4047-4d0b-baed-45afa8a53a60-kube-api-access-6bnwp\") pod \"swift-operator-controller-manager-677c674df7-m842c\" (UID: \"f2c81917-4047-4d0b-baed-45afa8a53a60\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.528996 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jnq5\" (UniqueName: \"kubernetes.io/projected/2db21a73-26d9-44d6-aa91-ba8068b0525a-kube-api-access-8jnq5\") pod \"watcher-operator-controller-manager-6dd88c6f67-7l7sm\" (UID: \"2db21a73-26d9-44d6-aa91-ba8068b0525a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.529032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtk5m\" (UniqueName: \"kubernetes.io/projected/3a930c0b-6c3b-4a1d-b02f-1190a124ceb2-kube-api-access-rtk5m\") pod \"placement-operator-controller-manager-574d45c66c-64dcj\" (UID: \"3a930c0b-6c3b-4a1d-b02f-1190a124ceb2\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.529057 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvnnw\" (UniqueName: \"kubernetes.io/projected/fe35cc9d-bfc6-4a4d-b21f-06ab55672726-kube-api-access-mvnnw\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-dlgkj\" (UID: \"fe35cc9d-bfc6-4a4d-b21f-06ab55672726\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.545177 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bnwp\" (UniqueName: \"kubernetes.io/projected/f2c81917-4047-4d0b-baed-45afa8a53a60-kube-api-access-6bnwp\") pod \"swift-operator-controller-manager-677c674df7-m842c\" (UID: \"f2c81917-4047-4d0b-baed-45afa8a53a60\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.545564 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtk5m\" (UniqueName: \"kubernetes.io/projected/3a930c0b-6c3b-4a1d-b02f-1190a124ceb2-kube-api-access-rtk5m\") pod \"placement-operator-controller-manager-574d45c66c-64dcj\" (UID: \"3a930c0b-6c3b-4a1d-b02f-1190a124ceb2\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.546137 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w2p\" (UniqueName: \"kubernetes.io/projected/ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b-kube-api-access-f4w2p\") pod \"test-operator-controller-manager-5c5cb9c4d7-2sf7v\" (UID: \"ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.546940 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvnnw\" (UniqueName: \"kubernetes.io/projected/fe35cc9d-bfc6-4a4d-b21f-06ab55672726-kube-api-access-mvnnw\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-dlgkj\" (UID: \"fe35cc9d-bfc6-4a4d-b21f-06ab55672726\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.547078 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.564340 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.575646 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.578260 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.598252 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.609561 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.610070 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.618048 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.625475 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-r6lf6"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.637946 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/9b888138-4648-48a6-9364-639fb0e0c8b6-kube-api-access-rzq9p\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.638062 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.638129 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jnq5\" (UniqueName: \"kubernetes.io/projected/2db21a73-26d9-44d6-aa91-ba8068b0525a-kube-api-access-8jnq5\") pod \"watcher-operator-controller-manager-6dd88c6f67-7l7sm\" (UID: \"2db21a73-26d9-44d6-aa91-ba8068b0525a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.638170 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.638257 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.642488 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.642757 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"
Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.644446 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.644517 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert podName:c09491c8-72c5-4019-91bf-37ee1a3a937c nodeName:}" failed. No retries permitted until 2026-03-12 13:26:14.64449158 +0000 UTC m=+997.334563551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert") pod "infra-operator-controller-manager-5995f4446f-9tkrv" (UID: "c09491c8-72c5-4019-91bf-37ee1a3a937c") : secret "infra-operator-webhook-server-cert" not found
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.646254 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm"]
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.668609 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.680978 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jnq5\" (UniqueName: \"kubernetes.io/projected/2db21a73-26d9-44d6-aa91-ba8068b0525a-kube-api-access-8jnq5\") pod \"watcher-operator-controller-manager-6dd88c6f67-7l7sm\" (UID: \"2db21a73-26d9-44d6-aa91-ba8068b0525a\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.683683 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.708517 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.740478 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dx47\" (UniqueName: \"kubernetes.io/projected/f0da206d-658e-47e1-9cfb-5b74237c406a-kube-api-access-9dx47\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h97zm\" (UID: \"f0da206d-658e-47e1-9cfb-5b74237c406a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.740949 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.741007 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"
Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.741294 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/9b888138-4648-48a6-9364-639fb0e0c8b6-kube-api-access-rzq9p\") pod 
\"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.742226 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.742273 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:14.242257957 +0000 UTC m=+996.932329928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "metrics-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.742422 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.742452 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:14.242442773 +0000 UTC m=+996.932514744 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "webhook-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.760524 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzq9p\" (UniqueName: \"kubernetes.io/projected/9b888138-4648-48a6-9364-639fb0e0c8b6-kube-api-access-rzq9p\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.839595 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv"] Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.845124 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dx47\" (UniqueName: \"kubernetes.io/projected/f0da206d-658e-47e1-9cfb-5b74237c406a-kube-api-access-9dx47\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h97zm\" (UID: \"f0da206d-658e-47e1-9cfb-5b74237c406a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.862136 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dx47\" (UniqueName: \"kubernetes.io/projected/f0da206d-658e-47e1-9cfb-5b74237c406a-kube-api-access-9dx47\") pod \"rabbitmq-cluster-operator-manager-668c99d594-h97zm\" (UID: \"f0da206d-658e-47e1-9cfb-5b74237c406a\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm" Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.946171 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.946391 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: E0312 13:26:13.946439 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert podName:0c9cd39f-8440-4f22-82ce-d3be95bea1be nodeName:}" failed. No retries permitted until 2026-03-12 13:26:14.946425481 +0000 UTC m=+997.636497452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" (UID: "0c9cd39f-8440-4f22-82ce-d3be95bea1be") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:13 crc kubenswrapper[4921]: I0312 13:26:13.964076 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.011397 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.031407 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm" Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.043863 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5908e8b2_d088_4190_8ccf_ea7526921e80.slice/crio-6b486f0f6e75138b9aeb4a3cd8b05bb54c4de3cd90e2fb02904975b14c950812 WatchSource:0}: Error finding container 6b486f0f6e75138b9aeb4a3cd8b05bb54c4de3cd90e2fb02904975b14c950812: Status 404 returned error can't find the container with id 6b486f0f6e75138b9aeb4a3cd8b05bb54c4de3cd90e2fb02904975b14c950812 Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.130540 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.139524 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.143855 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.150471 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.251929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.252049 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.252163 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.252191 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.252252 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:15.252233075 +0000 UTC m=+997.942305036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "webhook-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.252267 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:15.252261096 +0000 UTC m=+997.942333127 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "metrics-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.324085 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.329190 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h"] Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.374631 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1bc9ca_529d_4d59_a236_db1bb5c121ca.slice/crio-7a3f8d8c97a2c2195a13ade860a2e7c9ed72d6180e508b981439bf64b2623189 WatchSource:0}: Error finding container 7a3f8d8c97a2c2195a13ade860a2e7c9ed72d6180e508b981439bf64b2623189: Status 404 returned error can't find the container with id 7a3f8d8c97a2c2195a13ade860a2e7c9ed72d6180e508b981439bf64b2623189 Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.547580 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj"] Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.562333 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a930c0b_6c3b_4a1d_b02f_1190a124ceb2.slice/crio-0dc6a0dd85f828571ea50dc208b3f3d8d9eda7a4fa9033fff724070434979945 WatchSource:0}: Error finding container 0dc6a0dd85f828571ea50dc208b3f3d8d9eda7a4fa9033fff724070434979945: Status 404 returned error can't find the container with id 0dc6a0dd85f828571ea50dc208b3f3d8d9eda7a4fa9033fff724070434979945 Mar 12 13:26:14 crc 
kubenswrapper[4921]: I0312 13:26:14.564679 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.659878 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.659934 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.660024 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert podName:c09491c8-72c5-4019-91bf-37ee1a3a937c nodeName:}" failed. No retries permitted until 2026-03-12 13:26:16.659997426 +0000 UTC m=+999.350069397 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert") pod "infra-operator-controller-manager-5995f4446f-9tkrv" (UID: "c09491c8-72c5-4019-91bf-37ee1a3a937c") : secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.720654 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5"] Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.720875 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e1ee178_3f0e_405a_93cb_9414b2fccbe0.slice/crio-62851e581ed078bbea65290a7b3653459833305c6b2763a2a12752b9055ec767 WatchSource:0}: Error finding container 62851e581ed078bbea65290a7b3653459833305c6b2763a2a12752b9055ec767: Status 404 returned error can't find the container with id 62851e581ed078bbea65290a7b3653459833305c6b2763a2a12752b9055ec767 Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.735621 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.741949 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7"] Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.743133 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6131e4c9_d85a_4cdf_9cec_128c9e81bc29.slice/crio-002afb057afe1b95bc68a582fe84a2537ce8e8cea342ff320b1cb89383de0f00 WatchSource:0}: Error finding container 002afb057afe1b95bc68a582fe84a2537ce8e8cea342ff320b1cb89383de0f00: Status 404 returned error can't find the container with id 002afb057afe1b95bc68a582fe84a2537ce8e8cea342ff320b1cb89383de0f00 Mar 12 13:26:14 crc kubenswrapper[4921]: 
E0312 13:26:14.749524 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mvnnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-dlgkj_openstack-operators(fe35cc9d-bfc6-4a4d-b21f-06ab55672726): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.749879 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a0b0ff9_21c3_452f_9ded_00d374fbbcbe.slice/crio-190a712634a72802c15263eec2528bc89f535acb1d65aab4bb096bc5827ec819 WatchSource:0}: Error finding container 190a712634a72802c15263eec2528bc89f535acb1d65aab4bb096bc5827ec819: Status 404 returned error can't find the container with id 190a712634a72802c15263eec2528bc89f535acb1d65aab4bb096bc5827ec819 Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.750592 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" podUID="fe35cc9d-bfc6-4a4d-b21f-06ab55672726" Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.753510 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:38.129.56.97:5001/openstack-k8s-operators/nova-operator:8734adf928be66fa1f808466edcc3ea058f7094f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qbcwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-686d5f9fbd-hmkmx_openstack-operators(1a0b0ff9-21c3-452f-9ded-00d374fbbcbe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.754857 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8jnq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-7l7sm_openstack-operators(2db21a73-26d9-44d6-aa91-ba8068b0525a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.755036 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" podUID="1a0b0ff9-21c3-452f-9ded-00d374fbbcbe" Mar 12 13:26:14 crc 
kubenswrapper[4921]: I0312 13:26:14.757668 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx"] Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.757727 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" podUID="2db21a73-26d9-44d6-aa91-ba8068b0525a" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.763859 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.767353 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.841745 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" event={"ID":"0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea","Type":"ContainerStarted","Data":"cb9f8225262d77535f89bd038f979e6689d65b0d2c9970d3e7536f47af328f60"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.842870 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" event={"ID":"c6de3785-ea06-49bb-9b39-d8f2f10bce81","Type":"ContainerStarted","Data":"0b74d56f7d7d3860cc1f66e864d8050d548ef7d5ac593e90ce462eabb5e585a8"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.843688 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67" event={"ID":"2394f3bd-4f8b-4036-b240-7ed71b80798a","Type":"ContainerStarted","Data":"05c330db043f97ff17a60a15631d77c81127fa4de02ad9c35d9ed54eac6a7739"} Mar 12 13:26:14 crc kubenswrapper[4921]: 
I0312 13:26:14.846112 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h" event={"ID":"fd1bc9ca-529d-4d59-a236-db1bb5c121ca","Type":"ContainerStarted","Data":"7a3f8d8c97a2c2195a13ade860a2e7c9ed72d6180e508b981439bf64b2623189"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.847159 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m842c"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.847861 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" event={"ID":"1a0b0ff9-21c3-452f-9ded-00d374fbbcbe","Type":"ContainerStarted","Data":"190a712634a72802c15263eec2528bc89f535acb1d65aab4bb096bc5827ec819"} Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.849208 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.97:5001/openstack-k8s-operators/nova-operator:8734adf928be66fa1f808466edcc3ea058f7094f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" podUID="1a0b0ff9-21c3-452f-9ded-00d374fbbcbe" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.850038 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" event={"ID":"6a1a1aea-a74a-4886-ae24-1d188243e859","Type":"ContainerStarted","Data":"62e970a48253358c97ec82e332b542f7d45c79e85295bc31d4c53f96d5b41e61"} Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.854913 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c81917_4047_4d0b_baed_45afa8a53a60.slice/crio-c24030ca0c1c97b97116000df9f9bef1eefba7a0d0b63781df61fbd8f94894ad WatchSource:0}: Error finding container 
c24030ca0c1c97b97116000df9f9bef1eefba7a0d0b63781df61fbd8f94894ad: Status 404 returned error can't find the container with id c24030ca0c1c97b97116000df9f9bef1eefba7a0d0b63781df61fbd8f94894ad Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.855508 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" event={"ID":"7494cb10-090c-4ac2-bbf1-663979f3e4cf","Type":"ContainerStarted","Data":"48174f2cb0fb7673e37ce00dcf1ced09ff23a1232a9e7b2e43b14440f96f7402"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.858911 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj" event={"ID":"3a930c0b-6c3b-4a1d-b02f-1190a124ceb2","Type":"ContainerStarted","Data":"0dc6a0dd85f828571ea50dc208b3f3d8d9eda7a4fa9033fff724070434979945"} Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.858988 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6bnwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-m842c_openstack-operators(f2c81917-4047-4d0b-baed-45afa8a53a60): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.859705 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca8b207a_2cf1_455c_b7b4_0f7e2ec5a91b.slice/crio-fd27569d0edc0d88ae72f1151e64665f3ac514cefb98aa086104a5791419885b WatchSource:0}: Error finding container 
fd27569d0edc0d88ae72f1151e64665f3ac514cefb98aa086104a5791419885b: Status 404 returned error can't find the container with id fd27569d0edc0d88ae72f1151e64665f3ac514cefb98aa086104a5791419885b Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.860072 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" podUID="f2c81917-4047-4d0b-baed-45afa8a53a60" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.860716 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" event={"ID":"2db21a73-26d9-44d6-aa91-ba8068b0525a","Type":"ContainerStarted","Data":"fb8fd7fe850c119165cbd1ab3faf291c211db2bc120a0684abbcdc19c388269d"} Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.868451 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" podUID="2db21a73-26d9-44d6-aa91-ba8068b0525a" Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.869916 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4w2p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-2sf7v_openstack-operators(ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.869956 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" event={"ID":"ac8d4a43-01b6-438e-b1d8-d3521ed82176","Type":"ContainerStarted","Data":"59e405bd3d864c1aa427130001be011c8c3e47d38b94662bf119212dcd118c19"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.871018 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" event={"ID":"5908e8b2-d088-4190-8ccf-ea7526921e80","Type":"ContainerStarted","Data":"6b486f0f6e75138b9aeb4a3cd8b05bb54c4de3cd90e2fb02904975b14c950812"} Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.871073 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" 
podUID="ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.872105 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7" event={"ID":"4e1ee178-3f0e-405a-93cb-9414b2fccbe0","Type":"ContainerStarted","Data":"62851e581ed078bbea65290a7b3653459833305c6b2763a2a12752b9055ec767"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.873400 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5" event={"ID":"6131e4c9-d85a-4cdf-9cec-128c9e81bc29","Type":"ContainerStarted","Data":"002afb057afe1b95bc68a582fe84a2537ce8e8cea342ff320b1cb89383de0f00"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.876600 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v"] Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.878477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" event={"ID":"001425f5-0a2a-4bdc-a437-d6f9ba3687b4","Type":"ContainerStarted","Data":"e62dc5ae95b05540d8558ea256ee42ab3941efb1ac5caea7cfb1d548aa86c740"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.883542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" event={"ID":"fe35cc9d-bfc6-4a4d-b21f-06ab55672726","Type":"ContainerStarted","Data":"359d65a0f57bc646af44c2ff7cc4eaaa597e17c870c49a1f07617b6337920dd5"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.886235 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" event={"ID":"d4de9b0c-3812-462a-aa80-ffe00e6d47ca","Type":"ContainerStarted","Data":"d3605b114a88134a91e447c2a6aa37a8ebac70ddfb3cedb03a03b8e1abdf0444"} Mar 12 
13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.886800 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" podUID="fe35cc9d-bfc6-4a4d-b21f-06ab55672726" Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.890206 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" event={"ID":"994c3a47-47a7-4fbe-9f9c-df011597775b","Type":"ContainerStarted","Data":"8f19fa2a4ac846f4f9179f3072cae473c0dac186c31a05eaee9016e6c03af200"} Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.914201 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm"] Mar 12 13:26:14 crc kubenswrapper[4921]: W0312 13:26:14.926241 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0da206d_658e_47e1_9cfb_5b74237c406a.slice/crio-5fdc6247efd3f0a630a8179545b5eb10298a6d6bcd970642ce0ee535b4eaefb3 WatchSource:0}: Error finding container 5fdc6247efd3f0a630a8179545b5eb10298a6d6bcd970642ce0ee535b4eaefb3: Status 404 returned error can't find the container with id 5fdc6247efd3f0a630a8179545b5eb10298a6d6bcd970642ce0ee535b4eaefb3 Mar 12 13:26:14 crc kubenswrapper[4921]: I0312 13:26:14.967426 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 
12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.968759 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:14 crc kubenswrapper[4921]: E0312 13:26:14.968849 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert podName:0c9cd39f-8440-4f22-82ce-d3be95bea1be nodeName:}" failed. No retries permitted until 2026-03-12 13:26:16.968831845 +0000 UTC m=+999.658903806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" (UID: "0c9cd39f-8440-4f22-82ce-d3be95bea1be") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:15 crc kubenswrapper[4921]: I0312 13:26:15.272722 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:15 crc kubenswrapper[4921]: I0312 13:26:15.272796 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.273167 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 
13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.273224 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.273266 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:17.273242154 +0000 UTC m=+999.963314225 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "webhook-server-cert" not found Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.273324 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:17.273302846 +0000 UTC m=+999.963374817 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "metrics-server-cert" not found Mar 12 13:26:15 crc kubenswrapper[4921]: I0312 13:26:15.900776 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm" event={"ID":"f0da206d-658e-47e1-9cfb-5b74237c406a","Type":"ContainerStarted","Data":"5fdc6247efd3f0a630a8179545b5eb10298a6d6bcd970642ce0ee535b4eaefb3"} Mar 12 13:26:15 crc kubenswrapper[4921]: I0312 13:26:15.903019 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" event={"ID":"ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b","Type":"ContainerStarted","Data":"fd27569d0edc0d88ae72f1151e64665f3ac514cefb98aa086104a5791419885b"} Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.904089 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" podUID="ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b" Mar 12 13:26:15 crc kubenswrapper[4921]: I0312 13:26:15.906649 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" event={"ID":"f2c81917-4047-4d0b-baed-45afa8a53a60","Type":"ContainerStarted","Data":"c24030ca0c1c97b97116000df9f9bef1eefba7a0d0b63781df61fbd8f94894ad"} Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.907746 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" podUID="fe35cc9d-bfc6-4a4d-b21f-06ab55672726" Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.907990 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" podUID="f2c81917-4047-4d0b-baed-45afa8a53a60" Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.908503 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.97:5001/openstack-k8s-operators/nova-operator:8734adf928be66fa1f808466edcc3ea058f7094f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" podUID="1a0b0ff9-21c3-452f-9ded-00d374fbbcbe" Mar 12 13:26:15 crc kubenswrapper[4921]: E0312 13:26:15.909185 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" podUID="2db21a73-26d9-44d6-aa91-ba8068b0525a" Mar 12 13:26:16 crc kubenswrapper[4921]: I0312 13:26:16.694096 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: 
\"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:16 crc kubenswrapper[4921]: E0312 13:26:16.694287 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:16 crc kubenswrapper[4921]: E0312 13:26:16.694367 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert podName:c09491c8-72c5-4019-91bf-37ee1a3a937c nodeName:}" failed. No retries permitted until 2026-03-12 13:26:20.694348895 +0000 UTC m=+1003.384420866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert") pod "infra-operator-controller-manager-5995f4446f-9tkrv" (UID: "c09491c8-72c5-4019-91bf-37ee1a3a937c") : secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:16 crc kubenswrapper[4921]: E0312 13:26:16.915116 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" podUID="f2c81917-4047-4d0b-baed-45afa8a53a60" Mar 12 13:26:16 crc kubenswrapper[4921]: E0312 13:26:16.915414 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" podUID="ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b" Mar 12 13:26:16 crc kubenswrapper[4921]: I0312 13:26:16.998682 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:16 crc kubenswrapper[4921]: E0312 13:26:16.998852 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:16 crc kubenswrapper[4921]: E0312 13:26:16.998933 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert podName:0c9cd39f-8440-4f22-82ce-d3be95bea1be nodeName:}" failed. No retries permitted until 2026-03-12 13:26:20.998914309 +0000 UTC m=+1003.688986280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" (UID: "0c9cd39f-8440-4f22-82ce-d3be95bea1be") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:17 crc kubenswrapper[4921]: I0312 13:26:17.302436 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:17 crc kubenswrapper[4921]: I0312 13:26:17.302487 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod 
\"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:17 crc kubenswrapper[4921]: E0312 13:26:17.302624 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 13:26:17 crc kubenswrapper[4921]: E0312 13:26:17.302668 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:21.302656307 +0000 UTC m=+1003.992728278 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "webhook-server-cert" not found Mar 12 13:26:17 crc kubenswrapper[4921]: E0312 13:26:17.302999 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 13:26:17 crc kubenswrapper[4921]: E0312 13:26:17.303136 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:21.303106932 +0000 UTC m=+1003.993178953 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "metrics-server-cert" not found Mar 12 13:26:20 crc kubenswrapper[4921]: I0312 13:26:20.751177 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:20 crc kubenswrapper[4921]: E0312 13:26:20.751529 4921 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:20 crc kubenswrapper[4921]: E0312 13:26:20.751616 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert podName:c09491c8-72c5-4019-91bf-37ee1a3a937c nodeName:}" failed. No retries permitted until 2026-03-12 13:26:28.751591772 +0000 UTC m=+1011.441663813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert") pod "infra-operator-controller-manager-5995f4446f-9tkrv" (UID: "c09491c8-72c5-4019-91bf-37ee1a3a937c") : secret "infra-operator-webhook-server-cert" not found Mar 12 13:26:21 crc kubenswrapper[4921]: I0312 13:26:21.055638 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:21 crc kubenswrapper[4921]: E0312 13:26:21.055795 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:21 crc kubenswrapper[4921]: E0312 13:26:21.055894 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert podName:0c9cd39f-8440-4f22-82ce-d3be95bea1be nodeName:}" failed. No retries permitted until 2026-03-12 13:26:29.055873558 +0000 UTC m=+1011.745945529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" (UID: "0c9cd39f-8440-4f22-82ce-d3be95bea1be") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:21 crc kubenswrapper[4921]: I0312 13:26:21.360316 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:21 crc kubenswrapper[4921]: E0312 13:26:21.360498 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 13:26:21 crc kubenswrapper[4921]: I0312 13:26:21.360553 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:21 crc kubenswrapper[4921]: E0312 13:26:21.360569 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:29.360551375 +0000 UTC m=+1012.050623346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "webhook-server-cert" not found Mar 12 13:26:21 crc kubenswrapper[4921]: E0312 13:26:21.360767 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 13:26:21 crc kubenswrapper[4921]: E0312 13:26:21.360889 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:29.360863405 +0000 UTC m=+1012.050935416 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "metrics-server-cert" not found Mar 12 13:26:26 crc kubenswrapper[4921]: I0312 13:26:26.323238 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:26:26 crc kubenswrapper[4921]: I0312 13:26:26.324011 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:26:26 crc kubenswrapper[4921]: E0312 13:26:26.805043 4921 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f" Mar 12 13:26:26 crc kubenswrapper[4921]: E0312 13:26:26.805227 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dsnml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-x4tf4_openstack-operators(994c3a47-47a7-4fbe-9f9c-df011597775b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 13:26:26 crc kubenswrapper[4921]: E0312 13:26:26.806390 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" podUID="994c3a47-47a7-4fbe-9f9c-df011597775b" Mar 12 13:26:26 crc kubenswrapper[4921]: E0312 13:26:26.981671 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" podUID="994c3a47-47a7-4fbe-9f9c-df011597775b" Mar 12 13:26:27 crc kubenswrapper[4921]: E0312 13:26:27.459222 4921 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 12 13:26:27 crc kubenswrapper[4921]: E0312 13:26:27.459433 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qjnts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-v42m2_openstack-operators(d4de9b0c-3812-462a-aa80-ffe00e6d47ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 13:26:27 crc kubenswrapper[4921]: E0312 13:26:27.460539 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" podUID="d4de9b0c-3812-462a-aa80-ffe00e6d47ca" Mar 12 13:26:27 crc kubenswrapper[4921]: I0312 13:26:27.998022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5" event={"ID":"6131e4c9-d85a-4cdf-9cec-128c9e81bc29","Type":"ContainerStarted","Data":"d5fb71f6d3a4e1244fe498532f1423b3bbea3b65d5d2bd2bbdf4021b4e1cc032"} Mar 12 13:26:27 crc kubenswrapper[4921]: I0312 13:26:27.998377 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5" Mar 12 13:26:27 crc 
kubenswrapper[4921]: I0312 13:26:27.998393 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67" event={"ID":"2394f3bd-4f8b-4036-b240-7ed71b80798a","Type":"ContainerStarted","Data":"8189c378384f21a4ad03017abba194251a84a0e93303b1c4cd5fe25a733db0dc"} Mar 12 13:26:27 crc kubenswrapper[4921]: I0312 13:26:27.998408 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.001357 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm" event={"ID":"f0da206d-658e-47e1-9cfb-5b74237c406a","Type":"ContainerStarted","Data":"21f4a47525c71708012b020f6b9a87e00b8fc4fac21b05cea95b79f52b5442c4"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.003661 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h" event={"ID":"fd1bc9ca-529d-4d59-a236-db1bb5c121ca","Type":"ContainerStarted","Data":"40b6467395216a57dc1acf1544bad75d7bf22f9d0936ebbb39c1c617fdb933ce"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.003776 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.004896 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" event={"ID":"6a1a1aea-a74a-4886-ae24-1d188243e859","Type":"ContainerStarted","Data":"de59a82891a7c1bef6f9d348f701ff43aeed32b10a759876deb7bc2a3669daed"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.005514 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.008621 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj" event={"ID":"3a930c0b-6c3b-4a1d-b02f-1190a124ceb2","Type":"ContainerStarted","Data":"f5bbcdcb1dd35d7b3c4257bba88d08f42485f6e9569d8da04ec15d8743ef6306"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.008827 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.011604 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" event={"ID":"c6de3785-ea06-49bb-9b39-d8f2f10bce81","Type":"ContainerStarted","Data":"18d76e51e288863dc7e70b82f397a5e7913f569c4400ecb9c31da9fd5f32b6c5"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.011699 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.015859 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" event={"ID":"7494cb10-090c-4ac2-bbf1-663979f3e4cf","Type":"ContainerStarted","Data":"378a40a38fc01cbdcd386c91894227be8ffcaa8f8e068b62e106c568d15a8f1e"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.016462 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.016850 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5" podStartSLOduration=3.271082847 
podStartE2EDuration="16.016831546s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.744954832 +0000 UTC m=+997.435026803" lastFinishedPulling="2026-03-12 13:26:27.490703531 +0000 UTC m=+1010.180775502" observedRunningTime="2026-03-12 13:26:28.014582265 +0000 UTC m=+1010.704654236" watchObservedRunningTime="2026-03-12 13:26:28.016831546 +0000 UTC m=+1010.706903517" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.018695 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" event={"ID":"001425f5-0a2a-4bdc-a437-d6f9ba3687b4","Type":"ContainerStarted","Data":"e72f53a3057173c39872851cdea6419f0d26aaf72705853ec824268b8085c617"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.018866 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.021125 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" event={"ID":"0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea","Type":"ContainerStarted","Data":"7f28f5a6300644cc7f3f8c52d0fc15fcd5ade0acfb920ae8cdd9c42a48cbe98b"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.021256 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.034139 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7" event={"ID":"4e1ee178-3f0e-405a-93cb-9414b2fccbe0","Type":"ContainerStarted","Data":"69bd899303f5ea4c75aa4d2c6ec28dbc513cd81732c004f69621c6bee252436e"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.034319 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.036430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" event={"ID":"ac8d4a43-01b6-438e-b1d8-d3521ed82176","Type":"ContainerStarted","Data":"97f75f34959827aad6224af45e557a1e03db9ba277bded68be0cec7014bb2908"} Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.036615 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:28 crc kubenswrapper[4921]: E0312 13:26:28.038084 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" podUID="d4de9b0c-3812-462a-aa80-ffe00e6d47ca" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.098216 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" podStartSLOduration=2.996920285 podStartE2EDuration="16.098185778s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.329170018 +0000 UTC m=+997.019241989" lastFinishedPulling="2026-03-12 13:26:27.430435511 +0000 UTC m=+1010.120507482" observedRunningTime="2026-03-12 13:26:28.045386061 +0000 UTC m=+1010.735458032" watchObservedRunningTime="2026-03-12 13:26:28.098185778 +0000 UTC m=+1010.788257749" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.100450 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67" podStartSLOduration=3.211700123 podStartE2EDuration="16.100442238s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.591665633 +0000 UTC m=+997.281737604" lastFinishedPulling="2026-03-12 13:26:27.480407748 +0000 UTC m=+1010.170479719" observedRunningTime="2026-03-12 13:26:28.080351789 +0000 UTC m=+1010.770423760" watchObservedRunningTime="2026-03-12 13:26:28.100442238 +0000 UTC m=+1010.790514209" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.116392 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-h97zm" podStartSLOduration=2.551894196 podStartE2EDuration="15.116354398s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.928641394 +0000 UTC m=+997.618713365" lastFinishedPulling="2026-03-12 13:26:27.493101606 +0000 UTC m=+1010.183173567" observedRunningTime="2026-03-12 13:26:28.109444511 +0000 UTC m=+1010.799516482" watchObservedRunningTime="2026-03-12 13:26:28.116354398 +0000 UTC m=+1010.806426369" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.144158 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj" podStartSLOduration=2.2666827290000002 podStartE2EDuration="15.14414026s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.582044731 +0000 UTC m=+997.272116702" lastFinishedPulling="2026-03-12 13:26:27.459502262 +0000 UTC m=+1010.149574233" observedRunningTime="2026-03-12 13:26:28.136158649 +0000 UTC m=+1010.826230620" watchObservedRunningTime="2026-03-12 13:26:28.14414026 +0000 UTC m=+1010.834212231" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.198345 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h" podStartSLOduration=3.113661797 podStartE2EDuration="16.198327889s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.376648767 +0000 UTC m=+997.066720738" lastFinishedPulling="2026-03-12 13:26:27.461314859 +0000 UTC m=+1010.151386830" observedRunningTime="2026-03-12 13:26:28.196714929 +0000 UTC m=+1010.886786900" watchObservedRunningTime="2026-03-12 13:26:28.198327889 +0000 UTC m=+1010.888399860" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.224984 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" podStartSLOduration=3.015901411 podStartE2EDuration="16.224969145s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.242691915 +0000 UTC m=+996.932763886" lastFinishedPulling="2026-03-12 13:26:27.451759649 +0000 UTC m=+1010.141831620" observedRunningTime="2026-03-12 13:26:28.222678264 +0000 UTC m=+1010.912750235" watchObservedRunningTime="2026-03-12 13:26:28.224969145 +0000 UTC m=+1010.915041116" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.253420 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" podStartSLOduration=2.847576231 podStartE2EDuration="16.253403987s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.024723258 +0000 UTC m=+996.714795229" lastFinishedPulling="2026-03-12 13:26:27.430551004 +0000 UTC m=+1010.120622985" observedRunningTime="2026-03-12 13:26:28.251481676 +0000 UTC m=+1010.941553647" watchObservedRunningTime="2026-03-12 13:26:28.253403987 +0000 UTC m=+1010.943475958" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.293285 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" podStartSLOduration=2.701772428 podStartE2EDuration="16.293269538s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:13.890776656 +0000 UTC m=+996.580848627" lastFinishedPulling="2026-03-12 13:26:27.482273766 +0000 UTC m=+1010.172345737" observedRunningTime="2026-03-12 13:26:28.29015582 +0000 UTC m=+1010.980227791" watchObservedRunningTime="2026-03-12 13:26:28.293269538 +0000 UTC m=+1010.983341509" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.398764 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" podStartSLOduration=3.209690241 podStartE2EDuration="16.398744997s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.242952974 +0000 UTC m=+996.933024945" lastFinishedPulling="2026-03-12 13:26:27.43200772 +0000 UTC m=+1010.122079701" observedRunningTime="2026-03-12 13:26:28.387236866 +0000 UTC m=+1011.077308837" watchObservedRunningTime="2026-03-12 13:26:28.398744997 +0000 UTC m=+1011.088816968" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.412035 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7" podStartSLOduration=2.677641081 podStartE2EDuration="15.412021793s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.734297538 +0000 UTC m=+997.424369509" lastFinishedPulling="2026-03-12 13:26:27.46867825 +0000 UTC m=+1010.158750221" observedRunningTime="2026-03-12 13:26:28.410262458 +0000 UTC m=+1011.100334419" watchObservedRunningTime="2026-03-12 13:26:28.412021793 +0000 UTC m=+1011.102093754" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.780985 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.786094 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c09491c8-72c5-4019-91bf-37ee1a3a937c-cert\") pod \"infra-operator-controller-manager-5995f4446f-9tkrv\" (UID: \"c09491c8-72c5-4019-91bf-37ee1a3a937c\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:28 crc kubenswrapper[4921]: I0312 13:26:28.887994 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.087005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:29 crc kubenswrapper[4921]: E0312 13:26:29.087557 4921 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:29 crc kubenswrapper[4921]: E0312 13:26:29.087620 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert podName:0c9cd39f-8440-4f22-82ce-d3be95bea1be nodeName:}" failed. No retries permitted until 2026-03-12 13:26:45.087597296 +0000 UTC m=+1027.777669267 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" (UID: "0c9cd39f-8440-4f22-82ce-d3be95bea1be") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.088134 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" event={"ID":"5908e8b2-d088-4190-8ccf-ea7526921e80","Type":"ContainerStarted","Data":"8818da9e7526b2539e5c9d217864d4bdfd850847fb56e71b61d44135b762cc42"} Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.089888 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.112174 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" podStartSLOduration=3.6777106120000003 podStartE2EDuration="17.112149036s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.04869609 +0000 UTC m=+996.738768061" lastFinishedPulling="2026-03-12 13:26:27.483134504 +0000 UTC m=+1010.173206485" observedRunningTime="2026-03-12 13:26:29.103368351 +0000 UTC m=+1011.793440322" watchObservedRunningTime="2026-03-12 13:26:29.112149036 +0000 UTC m=+1011.802221007" Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.131303 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" podStartSLOduration=3.799062719 podStartE2EDuration="17.131284146s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.150294637 +0000 UTC m=+996.840366608" lastFinishedPulling="2026-03-12 
13:26:27.482516054 +0000 UTC m=+1010.172588035" observedRunningTime="2026-03-12 13:26:28.442689855 +0000 UTC m=+1011.132761826" watchObservedRunningTime="2026-03-12 13:26:29.131284146 +0000 UTC m=+1011.821356107" Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.393444 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.393516 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:29 crc kubenswrapper[4921]: E0312 13:26:29.393676 4921 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 13:26:29 crc kubenswrapper[4921]: E0312 13:26:29.393769 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:45.39374851 +0000 UTC m=+1028.083820481 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "webhook-server-cert" not found Mar 12 13:26:29 crc kubenswrapper[4921]: E0312 13:26:29.393686 4921 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 13:26:29 crc kubenswrapper[4921]: E0312 13:26:29.393880 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs podName:9b888138-4648-48a6-9364-639fb0e0c8b6 nodeName:}" failed. No retries permitted until 2026-03-12 13:26:45.393860093 +0000 UTC m=+1028.083932064 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs") pod "openstack-operator-controller-manager-5785b7957-24wxp" (UID: "9b888138-4648-48a6-9364-639fb0e0c8b6") : secret "metrics-server-cert" not found Mar 12 13:26:29 crc kubenswrapper[4921]: I0312 13:26:29.490309 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv"] Mar 12 13:26:30 crc kubenswrapper[4921]: I0312 13:26:30.103035 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" event={"ID":"c09491c8-72c5-4019-91bf-37ee1a3a937c","Type":"ContainerStarted","Data":"74be47e50408425cc345379c4736c7f5e9f537b6fbcf5c7044ec618be7e5260e"} Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.126198 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-dmwhv" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.151495 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-zmq56" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.162566 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-j46tf" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.197424 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-5jt7c" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.227969 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.245276 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.322629 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-67xqg" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.549570 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-xzm8h" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.569199 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-692s5" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.581793 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-kzh67" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.602644 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-574d45c66c-64dcj" Mar 12 13:26:33 crc kubenswrapper[4921]: I0312 13:26:33.674637 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bz8j7" Mar 12 13:26:38 crc kubenswrapper[4921]: I0312 13:26:38.830294 4921 scope.go:117] "RemoveContainer" containerID="1e1266d4e3cb9f6d4ada0273b076dc709587dba584b297b496c13f57a8b2cc16" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.162636 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" event={"ID":"ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b","Type":"ContainerStarted","Data":"06557480f2beab7cd943442187865c3b5ff1be69ed6b1cbd45cda4a356e0aaa4"} Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.163148 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.164208 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" event={"ID":"f2c81917-4047-4d0b-baed-45afa8a53a60","Type":"ContainerStarted","Data":"2614ac7135706e2aa87639c3db68b8024775fae116c63168f776ee34426735a0"} Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.164759 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.165843 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" event={"ID":"c09491c8-72c5-4019-91bf-37ee1a3a937c","Type":"ContainerStarted","Data":"c8492c6a78d953642347e1b08fc49f85fc0b1dc2d38930a30bfde440251d0974"} Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.166175 
4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.167992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" event={"ID":"1a0b0ff9-21c3-452f-9ded-00d374fbbcbe","Type":"ContainerStarted","Data":"dca51f0c94e6e67a0b24078c02fc10ea3d1f39da9169562549c653a0a1c0aaf4"} Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.168199 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.170855 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" event={"ID":"fe35cc9d-bfc6-4a4d-b21f-06ab55672726","Type":"ContainerStarted","Data":"19252047e8057399d52c6402a1f96c4611e37e1fc4858341509ab2a931be49d4"} Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.171011 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.172438 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" event={"ID":"2db21a73-26d9-44d6-aa91-ba8068b0525a","Type":"ContainerStarted","Data":"89b1089981a2fc371f0204bca5e593e8ac037e08158c0cd633fd48cdf6441808"} Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.172636 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.182154 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" podStartSLOduration=2.615970335 podStartE2EDuration="26.182121394s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.869803898 +0000 UTC m=+997.559875869" lastFinishedPulling="2026-03-12 13:26:38.435954957 +0000 UTC m=+1021.126026928" observedRunningTime="2026-03-12 13:26:39.178932534 +0000 UTC m=+1021.869004505" watchObservedRunningTime="2026-03-12 13:26:39.182121394 +0000 UTC m=+1021.872193365" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.194681 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" podStartSLOduration=18.372546245 podStartE2EDuration="27.194667058s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:29.524034867 +0000 UTC m=+1012.214106838" lastFinishedPulling="2026-03-12 13:26:38.34615568 +0000 UTC m=+1021.036227651" observedRunningTime="2026-03-12 13:26:39.192435968 +0000 UTC m=+1021.882507939" watchObservedRunningTime="2026-03-12 13:26:39.194667058 +0000 UTC m=+1021.884739029" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.214183 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" podStartSLOduration=2.562645053 podStartE2EDuration="26.21416596s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.749411101 +0000 UTC m=+997.439483072" lastFinishedPulling="2026-03-12 13:26:38.400932008 +0000 UTC m=+1021.091003979" observedRunningTime="2026-03-12 13:26:39.210737512 +0000 UTC m=+1021.900809493" watchObservedRunningTime="2026-03-12 13:26:39.21416596 +0000 UTC m=+1021.904237931" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.226960 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" podStartSLOduration=3.354224795 podStartE2EDuration="26.22693886s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.754731408 +0000 UTC m=+997.444803379" lastFinishedPulling="2026-03-12 13:26:37.627445463 +0000 UTC m=+1020.317517444" observedRunningTime="2026-03-12 13:26:39.226470435 +0000 UTC m=+1021.916542406" watchObservedRunningTime="2026-03-12 13:26:39.22693886 +0000 UTC m=+1021.917010831" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.251740 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" podStartSLOduration=2.671970702 podStartE2EDuration="26.251719358s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.858796933 +0000 UTC m=+997.548868904" lastFinishedPulling="2026-03-12 13:26:38.438545589 +0000 UTC m=+1021.128617560" observedRunningTime="2026-03-12 13:26:39.250637274 +0000 UTC m=+1021.940709245" watchObservedRunningTime="2026-03-12 13:26:39.251719358 +0000 UTC m=+1021.941791329" Mar 12 13:26:39 crc kubenswrapper[4921]: I0312 13:26:39.266503 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" podStartSLOduration=3.673707467 podStartE2EDuration="27.266488311s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.753371106 +0000 UTC m=+997.443443077" lastFinishedPulling="2026-03-12 13:26:38.34615195 +0000 UTC m=+1021.036223921" observedRunningTime="2026-03-12 13:26:39.265151709 +0000 UTC m=+1021.955223680" watchObservedRunningTime="2026-03-12 13:26:39.266488311 +0000 UTC m=+1021.956560282" Mar 12 13:26:40 crc kubenswrapper[4921]: I0312 13:26:40.178966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" event={"ID":"d4de9b0c-3812-462a-aa80-ffe00e6d47ca","Type":"ContainerStarted","Data":"e3cd50264259a4f0f52a7098874ab7a2a9ae5fc1228f30bcba24d64549efad29"} Mar 12 13:26:40 crc kubenswrapper[4921]: I0312 13:26:40.180178 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" Mar 12 13:26:40 crc kubenswrapper[4921]: I0312 13:26:40.180413 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" event={"ID":"994c3a47-47a7-4fbe-9f9c-df011597775b","Type":"ContainerStarted","Data":"525903b0a430ec5f5532ef86479d4ad54a4eeb0f0871a98f03a592ba865cc29d"} Mar 12 13:26:40 crc kubenswrapper[4921]: I0312 13:26:40.198319 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" podStartSLOduration=2.979997635 podStartE2EDuration="28.198297022s" podCreationTimestamp="2026-03-12 13:26:12 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.154479249 +0000 UTC m=+996.844551210" lastFinishedPulling="2026-03-12 13:26:39.372778626 +0000 UTC m=+1022.062850597" observedRunningTime="2026-03-12 13:26:40.197077294 +0000 UTC m=+1022.887149275" watchObservedRunningTime="2026-03-12 13:26:40.198297022 +0000 UTC m=+1022.888368993" Mar 12 13:26:40 crc kubenswrapper[4921]: I0312 13:26:40.220525 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" podStartSLOduration=2.3914871939999998 podStartE2EDuration="27.220508829s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:14.743977451 +0000 UTC m=+997.434049422" lastFinishedPulling="2026-03-12 13:26:39.572999086 +0000 UTC m=+1022.263071057" observedRunningTime="2026-03-12 13:26:40.214089197 
+0000 UTC m=+1022.904161168" watchObservedRunningTime="2026-03-12 13:26:40.220508829 +0000 UTC m=+1022.910580800" Mar 12 13:26:43 crc kubenswrapper[4921]: I0312 13:26:43.576381 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" Mar 12 13:26:43 crc kubenswrapper[4921]: I0312 13:26:43.612882 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m842c" Mar 12 13:26:43 crc kubenswrapper[4921]: I0312 13:26:43.675749 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-dlgkj" Mar 12 13:26:43 crc kubenswrapper[4921]: I0312 13:26:43.675803 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-686d5f9fbd-hmkmx" Mar 12 13:26:43 crc kubenswrapper[4921]: I0312 13:26:43.689401 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-2sf7v" Mar 12 13:26:43 crc kubenswrapper[4921]: I0312 13:26:43.710752 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-7l7sm" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.143652 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.156302 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0c9cd39f-8440-4f22-82ce-d3be95bea1be-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8\" (UID: \"0c9cd39f-8440-4f22-82ce-d3be95bea1be\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.324702 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dwtg9" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.333673 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.447739 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.448170 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.452473 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-webhook-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " 
pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.454660 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b888138-4648-48a6-9364-639fb0e0c8b6-metrics-certs\") pod \"openstack-operator-controller-manager-5785b7957-24wxp\" (UID: \"9b888138-4648-48a6-9364-639fb0e0c8b6\") " pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.523902 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8cqnd" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.533124 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.561285 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8"] Mar 12 13:26:45 crc kubenswrapper[4921]: W0312 13:26:45.568951 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c9cd39f_8440_4f22_82ce_d3be95bea1be.slice/crio-abd5dec7817923acbfca9ea546637c3e03e2b9c40a95559e57afcbccd8112185 WatchSource:0}: Error finding container abd5dec7817923acbfca9ea546637c3e03e2b9c40a95559e57afcbccd8112185: Status 404 returned error can't find the container with id abd5dec7817923acbfca9ea546637c3e03e2b9c40a95559e57afcbccd8112185 Mar 12 13:26:45 crc kubenswrapper[4921]: I0312 13:26:45.983184 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp"] Mar 12 13:26:45 crc kubenswrapper[4921]: W0312 13:26:45.989714 4921 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b888138_4648_48a6_9364_639fb0e0c8b6.slice/crio-d8efe80d7c373f9fb0d88be794b461d2c4a4b782a4f65c216bd5dba20de21701 WatchSource:0}: Error finding container d8efe80d7c373f9fb0d88be794b461d2c4a4b782a4f65c216bd5dba20de21701: Status 404 returned error can't find the container with id d8efe80d7c373f9fb0d88be794b461d2c4a4b782a4f65c216bd5dba20de21701 Mar 12 13:26:46 crc kubenswrapper[4921]: I0312 13:26:46.230503 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" event={"ID":"9b888138-4648-48a6-9364-639fb0e0c8b6","Type":"ContainerStarted","Data":"d8efe80d7c373f9fb0d88be794b461d2c4a4b782a4f65c216bd5dba20de21701"} Mar 12 13:26:46 crc kubenswrapper[4921]: I0312 13:26:46.231845 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" event={"ID":"0c9cd39f-8440-4f22-82ce-d3be95bea1be","Type":"ContainerStarted","Data":"abd5dec7817923acbfca9ea546637c3e03e2b9c40a95559e57afcbccd8112185"} Mar 12 13:26:47 crc kubenswrapper[4921]: I0312 13:26:47.239522 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" event={"ID":"9b888138-4648-48a6-9364-639fb0e0c8b6","Type":"ContainerStarted","Data":"1c7b3c89329612d16ed36457e73d91d545b9a7b2334fae3755a08df345021faa"} Mar 12 13:26:47 crc kubenswrapper[4921]: I0312 13:26:47.239932 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:47 crc kubenswrapper[4921]: I0312 13:26:47.266774 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" podStartSLOduration=34.266752782 podStartE2EDuration="34.266752782s" 
podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:26:47.263710787 +0000 UTC m=+1029.953782758" watchObservedRunningTime="2026-03-12 13:26:47.266752782 +0000 UTC m=+1029.956824753" Mar 12 13:26:48 crc kubenswrapper[4921]: I0312 13:26:48.903579 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-9tkrv" Mar 12 13:26:49 crc kubenswrapper[4921]: I0312 13:26:49.255531 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" event={"ID":"0c9cd39f-8440-4f22-82ce-d3be95bea1be","Type":"ContainerStarted","Data":"1e42a7e2edd4f0a7fcb5f2e9b398d9ebc7ae6d96d0dfd5ff3cc4c53a4dbdbb30"} Mar 12 13:26:49 crc kubenswrapper[4921]: I0312 13:26:49.255693 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:49 crc kubenswrapper[4921]: I0312 13:26:49.280515 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" podStartSLOduration=33.430363054 podStartE2EDuration="36.280494334s" podCreationTimestamp="2026-03-12 13:26:13 +0000 UTC" firstStartedPulling="2026-03-12 13:26:45.576971063 +0000 UTC m=+1028.267043034" lastFinishedPulling="2026-03-12 13:26:48.427102343 +0000 UTC m=+1031.117174314" observedRunningTime="2026-03-12 13:26:49.27654168 +0000 UTC m=+1031.966613681" watchObservedRunningTime="2026-03-12 13:26:49.280494334 +0000 UTC m=+1031.970566315" Mar 12 13:26:53 crc kubenswrapper[4921]: I0312 13:26:53.532001 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-v42m2" Mar 12 13:26:53 crc kubenswrapper[4921]: I0312 13:26:53.578855 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-x4tf4" Mar 12 13:26:55 crc kubenswrapper[4921]: I0312 13:26:55.343804 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8" Mar 12 13:26:55 crc kubenswrapper[4921]: I0312 13:26:55.539573 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5785b7957-24wxp" Mar 12 13:26:56 crc kubenswrapper[4921]: I0312 13:26:56.324534 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:26:56 crc kubenswrapper[4921]: I0312 13:26:56.324628 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:26:56 crc kubenswrapper[4921]: I0312 13:26:56.324702 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:26:56 crc kubenswrapper[4921]: I0312 13:26:56.325735 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3b38af5e8a74ac4ff0f8e664ea487d80830b0618d599c37b78cc47d7d985662"} 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:26:56 crc kubenswrapper[4921]: I0312 13:26:56.325922 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://d3b38af5e8a74ac4ff0f8e664ea487d80830b0618d599c37b78cc47d7d985662" gracePeriod=600 Mar 12 13:26:57 crc kubenswrapper[4921]: I0312 13:26:57.329593 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="d3b38af5e8a74ac4ff0f8e664ea487d80830b0618d599c37b78cc47d7d985662" exitCode=0 Mar 12 13:26:57 crc kubenswrapper[4921]: I0312 13:26:57.329669 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"d3b38af5e8a74ac4ff0f8e664ea487d80830b0618d599c37b78cc47d7d985662"} Mar 12 13:26:57 crc kubenswrapper[4921]: I0312 13:26:57.330265 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"f7722c7345ffa51f6b2d5016c3d605416a6961812caddc8f13639d2d6299573d"} Mar 12 13:26:57 crc kubenswrapper[4921]: I0312 13:26:57.330295 4921 scope.go:117] "RemoveContainer" containerID="107f4a8503d4c0486ad2c3402e1b2b2b1ceede9b611f44e27a27f3de56a8e4cf" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.108635 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g24hv"] Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.112265 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.114065 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.114300 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rwlf7" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.114460 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.114690 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.125854 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g24hv"] Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.175239 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwjdp\" (UniqueName: \"kubernetes.io/projected/3783026e-fb94-4df1-91af-59307a31aa5c-kube-api-access-qwjdp\") pod \"dnsmasq-dns-675f4bcbfc-g24hv\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.175329 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3783026e-fb94-4df1-91af-59307a31aa5c-config\") pod \"dnsmasq-dns-675f4bcbfc-g24hv\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.176432 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s66tt"] Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.177549 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.180041 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.186411 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s66tt"] Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.276884 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3783026e-fb94-4df1-91af-59307a31aa5c-config\") pod \"dnsmasq-dns-675f4bcbfc-g24hv\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.276938 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-config\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.276971 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6w6c\" (UniqueName: \"kubernetes.io/projected/905479e9-7a1c-4313-ba3e-c341e271a465-kube-api-access-q6w6c\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.277014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwjdp\" (UniqueName: \"kubernetes.io/projected/3783026e-fb94-4df1-91af-59307a31aa5c-kube-api-access-qwjdp\") pod \"dnsmasq-dns-675f4bcbfc-g24hv\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc 
kubenswrapper[4921]: I0312 13:27:14.277049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.278018 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3783026e-fb94-4df1-91af-59307a31aa5c-config\") pod \"dnsmasq-dns-675f4bcbfc-g24hv\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.295567 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwjdp\" (UniqueName: \"kubernetes.io/projected/3783026e-fb94-4df1-91af-59307a31aa5c-kube-api-access-qwjdp\") pod \"dnsmasq-dns-675f4bcbfc-g24hv\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.379041 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-config\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.379200 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6w6c\" (UniqueName: \"kubernetes.io/projected/905479e9-7a1c-4313-ba3e-c341e271a465-kube-api-access-q6w6c\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.379320 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.380419 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-config\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.380475 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.406058 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6w6c\" (UniqueName: \"kubernetes.io/projected/905479e9-7a1c-4313-ba3e-c341e271a465-kube-api-access-q6w6c\") pod \"dnsmasq-dns-78dd6ddcc-s66tt\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.438284 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.495692 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.812947 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s66tt"] Mar 12 13:27:14 crc kubenswrapper[4921]: W0312 13:27:14.817462 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod905479e9_7a1c_4313_ba3e_c341e271a465.slice/crio-90553e6621f30c8efc14cb0303ea752520cbff402e088f1b3c767d46a5be117b WatchSource:0}: Error finding container 90553e6621f30c8efc14cb0303ea752520cbff402e088f1b3c767d46a5be117b: Status 404 returned error can't find the container with id 90553e6621f30c8efc14cb0303ea752520cbff402e088f1b3c767d46a5be117b Mar 12 13:27:14 crc kubenswrapper[4921]: I0312 13:27:14.958316 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g24hv"] Mar 12 13:27:15 crc kubenswrapper[4921]: I0312 13:27:15.502577 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" event={"ID":"905479e9-7a1c-4313-ba3e-c341e271a465","Type":"ContainerStarted","Data":"90553e6621f30c8efc14cb0303ea752520cbff402e088f1b3c767d46a5be117b"} Mar 12 13:27:15 crc kubenswrapper[4921]: I0312 13:27:15.504079 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" event={"ID":"3783026e-fb94-4df1-91af-59307a31aa5c","Type":"ContainerStarted","Data":"ffe97cdaac9f6fe8bd7e09112a342b4f3465a084405facdcf4c0b9536ba1f956"} Mar 12 13:27:16 crc kubenswrapper[4921]: I0312 13:27:16.830266 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g24hv"] Mar 12 13:27:16 crc kubenswrapper[4921]: I0312 13:27:16.853683 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xwdrz"] Mar 12 13:27:16 crc kubenswrapper[4921]: I0312 13:27:16.854993 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:16 crc kubenswrapper[4921]: I0312 13:27:16.871910 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xwdrz"] Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.022028 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-624qx\" (UniqueName: \"kubernetes.io/projected/fb951944-74c3-4beb-9368-6b67ada02c98-kube-api-access-624qx\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.022445 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.022476 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-config\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.125043 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-624qx\" (UniqueName: \"kubernetes.io/projected/fb951944-74c3-4beb-9368-6b67ada02c98-kube-api-access-624qx\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.125111 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.125147 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-config\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.126174 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-config\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.130145 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-dns-svc\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.160551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-624qx\" (UniqueName: \"kubernetes.io/projected/fb951944-74c3-4beb-9368-6b67ada02c98-kube-api-access-624qx\") pod \"dnsmasq-dns-666b6646f7-xwdrz\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.182956 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.187032 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s66tt"] Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.214531 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ncsx9"] Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.231426 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.231765 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ncsx9"] Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.334714 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d79pd\" (UniqueName: \"kubernetes.io/projected/0347f6f0-0cbe-4543-8f1f-939b159b8652-kube-api-access-d79pd\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.335043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-config\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.335105 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 
13:27:17.435900 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d79pd\" (UniqueName: \"kubernetes.io/projected/0347f6f0-0cbe-4543-8f1f-939b159b8652-kube-api-access-d79pd\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.436211 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-config\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.436245 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.436999 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.438105 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-config\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.458035 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79pd\" 
(UniqueName: \"kubernetes.io/projected/0347f6f0-0cbe-4543-8f1f-939b159b8652-kube-api-access-d79pd\") pod \"dnsmasq-dns-57d769cc4f-ncsx9\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.600507 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:17 crc kubenswrapper[4921]: I0312 13:27:17.852248 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xwdrz"] Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.035916 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.042070 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.048419 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.048580 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.048769 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.048765 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.048936 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.049313 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.049517 4921 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4npht" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.080643 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.086967 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ncsx9"] Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149064 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149122 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149167 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf4146bb-5512-4a8d-81a6-b462a508be2f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149187 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 
13:27:18.149203 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf4146bb-5512-4a8d-81a6-b462a508be2f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149277 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149306 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149338 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrwz\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-kube-api-access-vbrwz\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149376 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.149394 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250455 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250514 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250545 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrwz\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-kube-api-access-vbrwz\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250588 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250637 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250662 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250702 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 
crc kubenswrapper[4921]: I0312 13:27:18.250718 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf4146bb-5512-4a8d-81a6-b462a508be2f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.250738 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf4146bb-5512-4a8d-81a6-b462a508be2f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.251051 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.251170 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.251777 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-config-data\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.253373 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.254767 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.254940 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") device mount path \"/mnt/openstack/pv19\"" pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.258570 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.258699 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.264854 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf4146bb-5512-4a8d-81a6-b462a508be2f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " 
pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.267407 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf4146bb-5512-4a8d-81a6-b462a508be2f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.269074 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrwz\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-kube-api-access-vbrwz\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.312195 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.365630 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.369125 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.369843 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.372561 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.372731 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5m2pc" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.372808 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.372890 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.373030 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.373350 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.373453 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.378908 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.528993 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" event={"ID":"fb951944-74c3-4beb-9368-6b67ada02c98","Type":"ContainerStarted","Data":"e6953a59492939ac81d47945099a70fd57800a82e2d9abc9509b1fdb9497b54e"} Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.530029 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" 
event={"ID":"0347f6f0-0cbe-4543-8f1f-939b159b8652","Type":"ContainerStarted","Data":"f506dfe7cc455027ab0c0fe7cf1467150cffdba0e0494eba0e32ba610207daf0"} Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.555249 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.555494 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v79p\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-kube-api-access-8v79p\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.555634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.555866 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.556026 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.556184 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.556327 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.556575 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.556750 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.556938 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.557054 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660300 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660354 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660400 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v79p\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-kube-api-access-8v79p\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660426 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660448 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660477 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") device mount path \"/mnt/openstack/pv20\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.660479 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.661072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.661120 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.661407 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.661500 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.661572 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.661739 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.663745 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.664060 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.667428 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.669483 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.670026 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.670049 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.671074 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.671964 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.681850 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v79p\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-kube-api-access-8v79p\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.683212 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:18 crc kubenswrapper[4921]: I0312 13:27:18.709419 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.445896 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.449059 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.451434 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.453189 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.453574 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-nmstl" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.458947 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.459119 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.462793 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579590 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298m4\" (UniqueName: \"kubernetes.io/projected/ab9571cc-4c2d-4462-adc5-f84bd590bcca-kube-api-access-298m4\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579650 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579718 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9571cc-4c2d-4462-adc5-f84bd590bcca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579772 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579827 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9571cc-4c2d-4462-adc5-f84bd590bcca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579848 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579883 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab9571cc-4c2d-4462-adc5-f84bd590bcca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.579915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.682527 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-298m4\" (UniqueName: \"kubernetes.io/projected/ab9571cc-4c2d-4462-adc5-f84bd590bcca-kube-api-access-298m4\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.682658 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.683828 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.683947 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9571cc-4c2d-4462-adc5-f84bd590bcca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.684016 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-config-data-default\") pod \"openstack-galera-0\" (UID: 
\"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.684062 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9571cc-4c2d-4462-adc5-f84bd590bcca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.684095 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.684138 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab9571cc-4c2d-4462-adc5-f84bd590bcca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.684175 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.685181 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 
13:27:19.685458 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab9571cc-4c2d-4462-adc5-f84bd590bcca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.686126 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.686613 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab9571cc-4c2d-4462-adc5-f84bd590bcca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.690064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9571cc-4c2d-4462-adc5-f84bd590bcca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.695119 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9571cc-4c2d-4462-adc5-f84bd590bcca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.707745 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-298m4\" (UniqueName: 
\"kubernetes.io/projected/ab9571cc-4c2d-4462-adc5-f84bd590bcca-kube-api-access-298m4\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.732316 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"ab9571cc-4c2d-4462-adc5-f84bd590bcca\") " pod="openstack/openstack-galera-0" Mar 12 13:27:19 crc kubenswrapper[4921]: I0312 13:27:19.776140 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.833076 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.838636 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.840268 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-p94dh" Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.842717 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.841202 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.845645 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 12 13:27:20 crc kubenswrapper[4921]: I0312 13:27:20.851573 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006339 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006631 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006672 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006729 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b5525a-14c6-453f-9673-11d9e63dd25a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006745 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006777 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69b5525a-14c6-453f-9673-11d9e63dd25a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006795 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8nb\" (UniqueName: \"kubernetes.io/projected/69b5525a-14c6-453f-9673-11d9e63dd25a-kube-api-access-hg8nb\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.006843 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b5525a-14c6-453f-9673-11d9e63dd25a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108343 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b5525a-14c6-453f-9673-11d9e63dd25a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108392 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108430 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69b5525a-14c6-453f-9673-11d9e63dd25a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108446 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8nb\" (UniqueName: \"kubernetes.io/projected/69b5525a-14c6-453f-9673-11d9e63dd25a-kube-api-access-hg8nb\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108479 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b5525a-14c6-453f-9673-11d9e63dd25a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108507 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108553 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.108888 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.116302 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.116555 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/69b5525a-14c6-453f-9673-11d9e63dd25a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.116574 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69b5525a-14c6-453f-9673-11d9e63dd25a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.117769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.118408 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/69b5525a-14c6-453f-9673-11d9e63dd25a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.138305 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/69b5525a-14c6-453f-9673-11d9e63dd25a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.144072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8nb\" (UniqueName: \"kubernetes.io/projected/69b5525a-14c6-453f-9673-11d9e63dd25a-kube-api-access-hg8nb\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.144265 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"69b5525a-14c6-453f-9673-11d9e63dd25a\") " pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.172162 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.232097 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.234340 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.236108 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q7qtc" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.236430 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.236570 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.239509 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.311375 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c221da-6e02-450a-a048-9c8292c208ff-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.311435 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0c221da-6e02-450a-a048-9c8292c208ff-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.311546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/f0c221da-6e02-450a-a048-9c8292c208ff-kolla-config\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.311588 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0c221da-6e02-450a-a048-9c8292c208ff-config-data\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.311618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkmbt\" (UniqueName: \"kubernetes.io/projected/f0c221da-6e02-450a-a048-9c8292c208ff-kube-api-access-kkmbt\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413067 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0c221da-6e02-450a-a048-9c8292c208ff-config-data\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413113 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkmbt\" (UniqueName: \"kubernetes.io/projected/f0c221da-6e02-450a-a048-9c8292c208ff-kube-api-access-kkmbt\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413164 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c221da-6e02-450a-a048-9c8292c208ff-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " 
pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0c221da-6e02-450a-a048-9c8292c208ff-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413279 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0c221da-6e02-450a-a048-9c8292c208ff-kolla-config\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413887 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f0c221da-6e02-450a-a048-9c8292c208ff-kolla-config\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.413959 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0c221da-6e02-450a-a048-9c8292c208ff-config-data\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.419726 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c221da-6e02-450a-a048-9c8292c208ff-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.427945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f0c221da-6e02-450a-a048-9c8292c208ff-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.432277 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkmbt\" (UniqueName: \"kubernetes.io/projected/f0c221da-6e02-450a-a048-9c8292c208ff-kube-api-access-kkmbt\") pod \"memcached-0\" (UID: \"f0c221da-6e02-450a-a048-9c8292c208ff\") " pod="openstack/memcached-0" Mar 12 13:27:21 crc kubenswrapper[4921]: I0312 13:27:21.555640 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.476609 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.477878 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.479727 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-xn7pb" Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.488802 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.645634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nkvb\" (UniqueName: \"kubernetes.io/projected/0f49cecf-a341-4a70-b7f7-e2f61c313f0a-kube-api-access-4nkvb\") pod \"kube-state-metrics-0\" (UID: \"0f49cecf-a341-4a70-b7f7-e2f61c313f0a\") " pod="openstack/kube-state-metrics-0" Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.747130 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nkvb\" (UniqueName: 
\"kubernetes.io/projected/0f49cecf-a341-4a70-b7f7-e2f61c313f0a-kube-api-access-4nkvb\") pod \"kube-state-metrics-0\" (UID: \"0f49cecf-a341-4a70-b7f7-e2f61c313f0a\") " pod="openstack/kube-state-metrics-0" Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.778522 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nkvb\" (UniqueName: \"kubernetes.io/projected/0f49cecf-a341-4a70-b7f7-e2f61c313f0a-kube-api-access-4nkvb\") pod \"kube-state-metrics-0\" (UID: \"0f49cecf-a341-4a70-b7f7-e2f61c313f0a\") " pod="openstack/kube-state-metrics-0" Mar 12 13:27:23 crc kubenswrapper[4921]: I0312 13:27:23.792956 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.972552 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s4mtb"] Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.973991 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.976515 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-b9k7f" Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.982184 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.982433 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.985741 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-z4nmg"] Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.987375 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:26 crc kubenswrapper[4921]: I0312 13:27:26.996791 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s4mtb"] Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.002850 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z4nmg"] Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112561 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2c49e53-e8d4-4f9b-a05e-f44516144d43-scripts\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112611 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-scripts\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112640 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-log\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112664 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-lib\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112695 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-ovn-controller-tls-certs\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112718 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-run\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112741 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-combined-ca-bundle\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112757 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-run\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112783 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-etc-ovs\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112801 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-log-ovn\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112856 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t757\" (UniqueName: \"kubernetes.io/projected/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-kube-api-access-2t757\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112880 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-run-ovn\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.112907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29g4f\" (UniqueName: \"kubernetes.io/projected/f2c49e53-e8d4-4f9b-a05e-f44516144d43-kube-api-access-29g4f\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.214829 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-log\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.214915 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-lib\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.214958 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-ovn-controller-tls-certs\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.214989 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-run\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-combined-ca-bundle\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215053 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-run\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215097 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-etc-ovs\") pod 
\"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-log-ovn\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215199 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t757\" (UniqueName: \"kubernetes.io/projected/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-kube-api-access-2t757\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215244 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-run-ovn\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215293 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29g4f\" (UniqueName: \"kubernetes.io/projected/f2c49e53-e8d4-4f9b-a05e-f44516144d43-kube-api-access-29g4f\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215357 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2c49e53-e8d4-4f9b-a05e-f44516144d43-scripts\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " 
pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215383 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-scripts\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-run\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215561 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-run\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215641 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-lib\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215859 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-etc-ovs\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.215974 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-run-ovn\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.216024 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-var-log-ovn\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.216638 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f2c49e53-e8d4-4f9b-a05e-f44516144d43-var-log\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.219315 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-scripts\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.219598 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2c49e53-e8d4-4f9b-a05e-f44516144d43-scripts\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.222416 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-ovn-controller-tls-certs\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " 
pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.237189 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-combined-ca-bundle\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.239902 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29g4f\" (UniqueName: \"kubernetes.io/projected/f2c49e53-e8d4-4f9b-a05e-f44516144d43-kube-api-access-29g4f\") pod \"ovn-controller-ovs-z4nmg\" (UID: \"f2c49e53-e8d4-4f9b-a05e-f44516144d43\") " pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.245465 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t757\" (UniqueName: \"kubernetes.io/projected/6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb-kube-api-access-2t757\") pod \"ovn-controller-s4mtb\" (UID: \"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb\") " pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.294009 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.300524 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-z4nmg" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.846584 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.851109 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.854765 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.855695 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-tfmlc" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.855748 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.855838 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.856231 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 12 13:27:27 crc kubenswrapper[4921]: I0312 13:27:27.857535 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036317 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036373 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036398 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036465 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036493 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl5mq\" (UniqueName: \"kubernetes.io/projected/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-kube-api-access-fl5mq\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036514 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.036560 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138490 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138550 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138662 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138699 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138730 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " 
pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138784 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138805 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl5mq\" (UniqueName: \"kubernetes.io/projected/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-kube-api-access-fl5mq\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138849 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.138902 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.140367 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.140481 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-config\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.141505 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.143701 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.144378 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.145592 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.155105 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl5mq\" (UniqueName: \"kubernetes.io/projected/ed0ceb5e-c541-4d3f-99b9-1865684ffa9d-kube-api-access-fl5mq\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " 
pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.160956 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d\") " pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:28 crc kubenswrapper[4921]: I0312 13:27:28.173490 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.923810 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.925686 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.931566 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.931665 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.931908 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.932088 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2pnq6" Mar 12 13:27:29 crc kubenswrapper[4921]: I0312 13:27:29.958365 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 13:27:29 crc kubenswrapper[4921]: E0312 13:27:29.987381 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 12 
13:27:29 crc kubenswrapper[4921]: E0312 13:27:29.987519 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6w6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile
:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-s66tt_openstack(905479e9-7a1c-4313-ba3e-c341e271a465): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 13:27:29 crc kubenswrapper[4921]: E0312 13:27:29.988906 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" podUID="905479e9-7a1c-4313-ba3e-c341e271a465" Mar 12 13:27:30 crc kubenswrapper[4921]: E0312 13:27:30.020490 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 12 13:27:30 crc kubenswrapper[4921]: E0312 13:27:30.020653 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qwjdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-g24hv_openstack(3783026e-fb94-4df1-91af-59307a31aa5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 13:27:30 crc kubenswrapper[4921]: E0312 13:27:30.022058 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" podUID="3783026e-fb94-4df1-91af-59307a31aa5c" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072470 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072672 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/228e4171-a3c9-483e-bfa6-1e0cef68384c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072739 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspld\" (UniqueName: \"kubernetes.io/projected/228e4171-a3c9-483e-bfa6-1e0cef68384c-kube-api-access-vspld\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072777 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " 
pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072793 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072830 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/228e4171-a3c9-483e-bfa6-1e0cef68384c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.072874 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228e4171-a3c9-483e-bfa6-1e0cef68384c-config\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.173780 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.173827 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.173855 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/228e4171-a3c9-483e-bfa6-1e0cef68384c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.173927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228e4171-a3c9-483e-bfa6-1e0cef68384c-config\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.173956 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.173974 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.174009 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/228e4171-a3c9-483e-bfa6-1e0cef68384c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.174033 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspld\" (UniqueName: \"kubernetes.io/projected/228e4171-a3c9-483e-bfa6-1e0cef68384c-kube-api-access-vspld\") pod 
\"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.174949 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.175341 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228e4171-a3c9-483e-bfa6-1e0cef68384c-config\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.176240 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/228e4171-a3c9-483e-bfa6-1e0cef68384c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.176479 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/228e4171-a3c9-483e-bfa6-1e0cef68384c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.190053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.192074 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.194788 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/228e4171-a3c9-483e-bfa6-1e0cef68384c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.195285 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.195634 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspld\" (UniqueName: \"kubernetes.io/projected/228e4171-a3c9-483e-bfa6-1e0cef68384c-kube-api-access-vspld\") pod \"ovsdbserver-sb-0\" (UID: \"228e4171-a3c9-483e-bfa6-1e0cef68384c\") " pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.313396 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.439623 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: W0312 13:27:30.496889 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf4146bb_5512_4a8d_81a6_b462a508be2f.slice/crio-62643b0de7f19420512cbfdd05ebbfd924dfd73b565268f9f942cd53ca8d1a75 WatchSource:0}: Error finding container 62643b0de7f19420512cbfdd05ebbfd924dfd73b565268f9f942cd53ca8d1a75: Status 404 returned error can't find the container with id 62643b0de7f19420512cbfdd05ebbfd924dfd73b565268f9f942cd53ca8d1a75 Mar 12 13:27:30 crc kubenswrapper[4921]: W0312 13:27:30.603839 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0c221da_6e02_450a_a048_9c8292c208ff.slice/crio-e1dc5d5839c2eb9a5479bcd2947904609563fc880cb3a4ab49f0bd2e204d95f1 WatchSource:0}: Error finding container e1dc5d5839c2eb9a5479bcd2947904609563fc880cb3a4ab49f0bd2e204d95f1: Status 404 returned error can't find the container with id e1dc5d5839c2eb9a5479bcd2947904609563fc880cb3a4ab49f0bd2e204d95f1 Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.604095 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: W0312 13:27:30.610751 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83f4404_c7af_4fb6_aa92_6ac4e691a27f.slice/crio-447c2eb8ab99e771dedb952f242b8cce5ba9a0f567f4183ce92b7da4ae1fa3e9 WatchSource:0}: Error finding container 447c2eb8ab99e771dedb952f242b8cce5ba9a0f567f4183ce92b7da4ae1fa3e9: Status 404 returned error can't find the container with id 447c2eb8ab99e771dedb952f242b8cce5ba9a0f567f4183ce92b7da4ae1fa3e9 Mar 12 13:27:30 crc 
kubenswrapper[4921]: W0312 13:27:30.616550 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab9571cc_4c2d_4462_adc5_f84bd590bcca.slice/crio-998a267750b18238548980205a4007a05f1343588989b2c47ee6dce262ae6f32 WatchSource:0}: Error finding container 998a267750b18238548980205a4007a05f1343588989b2c47ee6dce262ae6f32: Status 404 returned error can't find the container with id 998a267750b18238548980205a4007a05f1343588989b2c47ee6dce262ae6f32 Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.617969 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.625678 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.632538 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.675069 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf4146bb-5512-4a8d-81a6-b462a508be2f","Type":"ContainerStarted","Data":"62643b0de7f19420512cbfdd05ebbfd924dfd73b565268f9f942cd53ca8d1a75"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.676801 4921 generic.go:334] "Generic (PLEG): container finished" podID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerID="af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d" exitCode=0 Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.676872 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" event={"ID":"0347f6f0-0cbe-4543-8f1f-939b159b8652","Type":"ContainerDied","Data":"af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.678699 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"f0c221da-6e02-450a-a048-9c8292c208ff","Type":"ContainerStarted","Data":"e1dc5d5839c2eb9a5479bcd2947904609563fc880cb3a4ab49f0bd2e204d95f1"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.686104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab9571cc-4c2d-4462-adc5-f84bd590bcca","Type":"ContainerStarted","Data":"998a267750b18238548980205a4007a05f1343588989b2c47ee6dce262ae6f32"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.705224 4921 generic.go:334] "Generic (PLEG): container finished" podID="fb951944-74c3-4beb-9368-6b67ada02c98" containerID="a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a" exitCode=0 Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.705339 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" event={"ID":"fb951944-74c3-4beb-9368-6b67ada02c98","Type":"ContainerDied","Data":"a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.717658 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c83f4404-c7af-4fb6-aa92-6ac4e691a27f","Type":"ContainerStarted","Data":"447c2eb8ab99e771dedb952f242b8cce5ba9a0f567f4183ce92b7da4ae1fa3e9"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.723273 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69b5525a-14c6-453f-9673-11d9e63dd25a","Type":"ContainerStarted","Data":"95fc21c3a70f92d3f13c016a955174d7d7ebf77b601e9707a833e351304dadca"} Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.753941 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s4mtb"] Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.787433 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 
13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.829992 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: I0312 13:27:30.941079 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 13:27:30 crc kubenswrapper[4921]: W0312 13:27:30.951426 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod228e4171_a3c9_483e_bfa6_1e0cef68384c.slice/crio-76fcff77264d15e14fe72821d9bc4c1b3e96e22a6e4feae45e12533f08b04844 WatchSource:0}: Error finding container 76fcff77264d15e14fe72821d9bc4c1b3e96e22a6e4feae45e12533f08b04844: Status 404 returned error can't find the container with id 76fcff77264d15e14fe72821d9bc4c1b3e96e22a6e4feae45e12533f08b04844 Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.150154 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.158295 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313009 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3783026e-fb94-4df1-91af-59307a31aa5c-config\") pod \"3783026e-fb94-4df1-91af-59307a31aa5c\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313078 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwjdp\" (UniqueName: \"kubernetes.io/projected/3783026e-fb94-4df1-91af-59307a31aa5c-kube-api-access-qwjdp\") pod \"3783026e-fb94-4df1-91af-59307a31aa5c\" (UID: \"3783026e-fb94-4df1-91af-59307a31aa5c\") " Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313110 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-dns-svc\") pod \"905479e9-7a1c-4313-ba3e-c341e271a465\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313168 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-config\") pod \"905479e9-7a1c-4313-ba3e-c341e271a465\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313202 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6w6c\" (UniqueName: \"kubernetes.io/projected/905479e9-7a1c-4313-ba3e-c341e271a465-kube-api-access-q6w6c\") pod \"905479e9-7a1c-4313-ba3e-c341e271a465\" (UID: \"905479e9-7a1c-4313-ba3e-c341e271a465\") " Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313652 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "905479e9-7a1c-4313-ba3e-c341e271a465" (UID: "905479e9-7a1c-4313-ba3e-c341e271a465"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.313683 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-config" (OuterVolumeSpecName: "config") pod "905479e9-7a1c-4313-ba3e-c341e271a465" (UID: "905479e9-7a1c-4313-ba3e-c341e271a465"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.314591 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3783026e-fb94-4df1-91af-59307a31aa5c-config" (OuterVolumeSpecName: "config") pod "3783026e-fb94-4df1-91af-59307a31aa5c" (UID: "3783026e-fb94-4df1-91af-59307a31aa5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.318571 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3783026e-fb94-4df1-91af-59307a31aa5c-kube-api-access-qwjdp" (OuterVolumeSpecName: "kube-api-access-qwjdp") pod "3783026e-fb94-4df1-91af-59307a31aa5c" (UID: "3783026e-fb94-4df1-91af-59307a31aa5c"). InnerVolumeSpecName "kube-api-access-qwjdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.333887 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905479e9-7a1c-4313-ba3e-c341e271a465-kube-api-access-q6w6c" (OuterVolumeSpecName: "kube-api-access-q6w6c") pod "905479e9-7a1c-4313-ba3e-c341e271a465" (UID: "905479e9-7a1c-4313-ba3e-c341e271a465"). InnerVolumeSpecName "kube-api-access-q6w6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.415371 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwjdp\" (UniqueName: \"kubernetes.io/projected/3783026e-fb94-4df1-91af-59307a31aa5c-kube-api-access-qwjdp\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.415422 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.415434 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/905479e9-7a1c-4313-ba3e-c341e271a465-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.415442 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6w6c\" (UniqueName: \"kubernetes.io/projected/905479e9-7a1c-4313-ba3e-c341e271a465-kube-api-access-q6w6c\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.415452 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3783026e-fb94-4df1-91af-59307a31aa5c-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.570696 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-z4nmg"] Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.731477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d","Type":"ContainerStarted","Data":"b3b7928b4962eb4523bb5036fab46365640d1f93946fecdd0b0fa56017d866b8"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.734117 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" event={"ID":"0347f6f0-0cbe-4543-8f1f-939b159b8652","Type":"ContainerStarted","Data":"e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.734180 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.735416 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb" event={"ID":"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb","Type":"ContainerStarted","Data":"012b864a96ff9dcb020470e3801ab9ab66d0732bc38d9ffe4c8130314c46ce14"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.739163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" event={"ID":"fb951944-74c3-4beb-9368-6b67ada02c98","Type":"ContainerStarted","Data":"5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.740008 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.740994 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" event={"ID":"905479e9-7a1c-4313-ba3e-c341e271a465","Type":"ContainerDied","Data":"90553e6621f30c8efc14cb0303ea752520cbff402e088f1b3c767d46a5be117b"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.741054 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-s66tt" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.742649 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f49cecf-a341-4a70-b7f7-e2f61c313f0a","Type":"ContainerStarted","Data":"21e3ee46f781bab5c6a324ec045f8a8d497763cde1001138980f05c6078e90f6"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.744118 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" event={"ID":"3783026e-fb94-4df1-91af-59307a31aa5c","Type":"ContainerDied","Data":"ffe97cdaac9f6fe8bd7e09112a342b4f3465a084405facdcf4c0b9536ba1f956"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.744259 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-g24hv" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.745504 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"228e4171-a3c9-483e-bfa6-1e0cef68384c","Type":"ContainerStarted","Data":"76fcff77264d15e14fe72821d9bc4c1b3e96e22a6e4feae45e12533f08b04844"} Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.754070 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" podStartSLOduration=2.7285000310000003 podStartE2EDuration="14.754048416s" podCreationTimestamp="2026-03-12 13:27:17 +0000 UTC" firstStartedPulling="2026-03-12 13:27:18.07052241 +0000 UTC m=+1060.760594371" lastFinishedPulling="2026-03-12 13:27:30.096070785 +0000 UTC m=+1072.786142756" observedRunningTime="2026-03-12 13:27:31.748225217 +0000 UTC m=+1074.438297208" watchObservedRunningTime="2026-03-12 13:27:31.754048416 +0000 UTC m=+1074.444120387" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.777537 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" 
podStartSLOduration=3.565856441 podStartE2EDuration="15.777515729s" podCreationTimestamp="2026-03-12 13:27:16 +0000 UTC" firstStartedPulling="2026-03-12 13:27:17.859308373 +0000 UTC m=+1060.549380344" lastFinishedPulling="2026-03-12 13:27:30.070967661 +0000 UTC m=+1072.761039632" observedRunningTime="2026-03-12 13:27:31.768592234 +0000 UTC m=+1074.458664205" watchObservedRunningTime="2026-03-12 13:27:31.777515729 +0000 UTC m=+1074.467587700" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.828925 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g24hv"] Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.834925 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-g24hv"] Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.844926 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s66tt"] Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.851024 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-s66tt"] Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.995226 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3783026e-fb94-4df1-91af-59307a31aa5c" path="/var/lib/kubelet/pods/3783026e-fb94-4df1-91af-59307a31aa5c/volumes" Mar 12 13:27:31 crc kubenswrapper[4921]: I0312 13:27:31.995603 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905479e9-7a1c-4313-ba3e-c341e271a465" path="/var/lib/kubelet/pods/905479e9-7a1c-4313-ba3e-c341e271a465/volumes" Mar 12 13:27:32 crc kubenswrapper[4921]: W0312 13:27:32.402616 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2c49e53_e8d4_4f9b_a05e_f44516144d43.slice/crio-3922b90a1078e67b46758911ea05e06d4b75e98eedae31e8b208ee3b2efac067 WatchSource:0}: Error finding container 
3922b90a1078e67b46758911ea05e06d4b75e98eedae31e8b208ee3b2efac067: Status 404 returned error can't find the container with id 3922b90a1078e67b46758911ea05e06d4b75e98eedae31e8b208ee3b2efac067 Mar 12 13:27:32 crc kubenswrapper[4921]: I0312 13:27:32.773614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z4nmg" event={"ID":"f2c49e53-e8d4-4f9b-a05e-f44516144d43","Type":"ContainerStarted","Data":"3922b90a1078e67b46758911ea05e06d4b75e98eedae31e8b208ee3b2efac067"} Mar 12 13:27:37 crc kubenswrapper[4921]: I0312 13:27:37.185033 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:37 crc kubenswrapper[4921]: I0312 13:27:37.601931 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:27:37 crc kubenswrapper[4921]: I0312 13:27:37.671225 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xwdrz"] Mar 12 13:27:37 crc kubenswrapper[4921]: I0312 13:27:37.811921 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" containerName="dnsmasq-dns" containerID="cri-o://5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082" gracePeriod=10 Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.360346 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.444415 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-624qx\" (UniqueName: \"kubernetes.io/projected/fb951944-74c3-4beb-9368-6b67ada02c98-kube-api-access-624qx\") pod \"fb951944-74c3-4beb-9368-6b67ada02c98\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.444533 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-config\") pod \"fb951944-74c3-4beb-9368-6b67ada02c98\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.445426 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-dns-svc\") pod \"fb951944-74c3-4beb-9368-6b67ada02c98\" (UID: \"fb951944-74c3-4beb-9368-6b67ada02c98\") " Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.450992 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb951944-74c3-4beb-9368-6b67ada02c98-kube-api-access-624qx" (OuterVolumeSpecName: "kube-api-access-624qx") pod "fb951944-74c3-4beb-9368-6b67ada02c98" (UID: "fb951944-74c3-4beb-9368-6b67ada02c98"). InnerVolumeSpecName "kube-api-access-624qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.478751 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-config" (OuterVolumeSpecName: "config") pod "fb951944-74c3-4beb-9368-6b67ada02c98" (UID: "fb951944-74c3-4beb-9368-6b67ada02c98"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.479360 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb951944-74c3-4beb-9368-6b67ada02c98" (UID: "fb951944-74c3-4beb-9368-6b67ada02c98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.547097 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-config\") on node \"crc\" DevicePath \"\""
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.547142 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb951944-74c3-4beb-9368-6b67ada02c98-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.547156 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-624qx\" (UniqueName: \"kubernetes.io/projected/fb951944-74c3-4beb-9368-6b67ada02c98-kube-api-access-624qx\") on node \"crc\" DevicePath \"\""
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.824929 4921 generic.go:334] "Generic (PLEG): container finished" podID="fb951944-74c3-4beb-9368-6b67ada02c98" containerID="5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082" exitCode=0
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.825005 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" event={"ID":"fb951944-74c3-4beb-9368-6b67ada02c98","Type":"ContainerDied","Data":"5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082"}
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.825037 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz" event={"ID":"fb951944-74c3-4beb-9368-6b67ada02c98","Type":"ContainerDied","Data":"e6953a59492939ac81d47945099a70fd57800a82e2d9abc9509b1fdb9497b54e"}
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.825059 4921 scope.go:117] "RemoveContainer" containerID="5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082"
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.825176 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-xwdrz"
Mar 12 13:27:38 crc kubenswrapper[4921]: I0312 13:27:38.828979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69b5525a-14c6-453f-9673-11d9e63dd25a","Type":"ContainerStarted","Data":"f7a2cf519bedda3d29c5253ce2fcdcb4ae83fe0abb3568c4b0ade347f8aa798f"}
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.716113 4921 scope.go:117] "RemoveContainer" containerID="a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a"
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.839185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d","Type":"ContainerStarted","Data":"44845ae9888737f5fbb8d1db265ebd57344aae085aa09dee2a71442a5bd85f2c"}
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.841243 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f0c221da-6e02-450a-a048-9c8292c208ff","Type":"ContainerStarted","Data":"388532a12157f3eb68eb15bacfaad0b7a3d309afb7504e65f4f0446b83bae694"}
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.841384 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.845135 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf4146bb-5512-4a8d-81a6-b462a508be2f","Type":"ContainerStarted","Data":"84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd"}
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.870996 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.467852398 podStartE2EDuration="18.870976806s" podCreationTimestamp="2026-03-12 13:27:21 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.610115152 +0000 UTC m=+1073.300187123" lastFinishedPulling="2026-03-12 13:27:37.01323956 +0000 UTC m=+1079.703311531" observedRunningTime="2026-03-12 13:27:39.858711539 +0000 UTC m=+1082.548783510" watchObservedRunningTime="2026-03-12 13:27:39.870976806 +0000 UTC m=+1082.561048767"
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.941905 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xwdrz"]
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.947034 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-xwdrz"]
Mar 12 13:27:39 crc kubenswrapper[4921]: I0312 13:27:39.992598 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" path="/var/lib/kubelet/pods/fb951944-74c3-4beb-9368-6b67ada02c98/volumes"
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.041980 4921 scope.go:117] "RemoveContainer" containerID="5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082"
Mar 12 13:27:40 crc kubenswrapper[4921]: E0312 13:27:40.042780 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082\": container with ID starting with 5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082 not found: ID does not exist" containerID="5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082"
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.042833 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082"} err="failed to get container status \"5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082\": rpc error: code = NotFound desc = could not find container \"5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082\": container with ID starting with 5e0d18480d08eb517524619e33899f726e8c388783c4e4102b07290aeeb06082 not found: ID does not exist"
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.042861 4921 scope.go:117] "RemoveContainer" containerID="a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a"
Mar 12 13:27:40 crc kubenswrapper[4921]: E0312 13:27:40.043087 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a\": container with ID starting with a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a not found: ID does not exist" containerID="a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a"
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.043105 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a"} err="failed to get container status \"a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a\": rpc error: code = NotFound desc = could not find container \"a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a\": container with ID starting with a311ea697bcf1dc74b040a955733fdbb760d28d48632be9755c3570863f6748a not found: ID does not exist"
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.854788 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab9571cc-4c2d-4462-adc5-f84bd590bcca","Type":"ContainerStarted","Data":"4fc05f91e7b0d3dac2ce710337ca89f16cd85f5b23a77148bdfb2f29d1ced007"}
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.867449 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c83f4404-c7af-4fb6-aa92-6ac4e691a27f","Type":"ContainerStarted","Data":"cfab788e9f5ce4b3ba25a82075975900efb79d705a9bb5a1bdfdd4a9183dccb7"}
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.871254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"228e4171-a3c9-483e-bfa6-1e0cef68384c","Type":"ContainerStarted","Data":"fd66239b86a1905e701f71bf1d178daf4271ddfade6e03d04157d07868cb689e"}
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.880703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb" event={"ID":"6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb","Type":"ContainerStarted","Data":"675841939d8e015e571a21f0defa0bcc2aae505a87b957565ab57e439323d4ac"}
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.880754 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s4mtb"
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.883235 4921 generic.go:334] "Generic (PLEG): container finished" podID="f2c49e53-e8d4-4f9b-a05e-f44516144d43" containerID="c9c14d4c1c926343dbc5908eda56df54e15e1c4d647df3a26e93e49a901e5f1f" exitCode=0
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.883320 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z4nmg" event={"ID":"f2c49e53-e8d4-4f9b-a05e-f44516144d43","Type":"ContainerDied","Data":"c9c14d4c1c926343dbc5908eda56df54e15e1c4d647df3a26e93e49a901e5f1f"}
Mar 12 13:27:40 crc kubenswrapper[4921]: I0312 13:27:40.912716 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s4mtb" podStartSLOduration=7.584595394 podStartE2EDuration="14.912698401s" podCreationTimestamp="2026-03-12 13:27:26 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.749534097 +0000 UTC m=+1073.439606068" lastFinishedPulling="2026-03-12 13:27:38.077637104 +0000 UTC m=+1080.767709075" observedRunningTime="2026-03-12 13:27:40.911772023 +0000 UTC m=+1083.601843994" watchObservedRunningTime="2026-03-12 13:27:40.912698401 +0000 UTC m=+1083.602770372"
Mar 12 13:27:41 crc kubenswrapper[4921]: I0312 13:27:41.891233 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z4nmg" event={"ID":"f2c49e53-e8d4-4f9b-a05e-f44516144d43","Type":"ContainerStarted","Data":"196bd9eba6c3326622a85ed3c050ebf9691cc8d0cd557fe95bf13eacaca3b352"}
Mar 12 13:27:42 crc kubenswrapper[4921]: I0312 13:27:42.901306 4921 generic.go:334] "Generic (PLEG): container finished" podID="69b5525a-14c6-453f-9673-11d9e63dd25a" containerID="f7a2cf519bedda3d29c5253ce2fcdcb4ae83fe0abb3568c4b0ade347f8aa798f" exitCode=0
Mar 12 13:27:42 crc kubenswrapper[4921]: I0312 13:27:42.901609 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69b5525a-14c6-453f-9673-11d9e63dd25a","Type":"ContainerDied","Data":"f7a2cf519bedda3d29c5253ce2fcdcb4ae83fe0abb3568c4b0ade347f8aa798f"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.914336 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"69b5525a-14c6-453f-9673-11d9e63dd25a","Type":"ContainerStarted","Data":"21540c9fa2fdb70aee1a23fda9dc1faf43bd955a13fc7562b56d666b552db0bd"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.923131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"228e4171-a3c9-483e-bfa6-1e0cef68384c","Type":"ContainerStarted","Data":"f736bd21f0787289e15b2c07a86d3f1f7755a3f3b3921c8bd6fcb647f2b48105"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.925840 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ed0ceb5e-c541-4d3f-99b9-1865684ffa9d","Type":"ContainerStarted","Data":"3d429b622c207ac0291e9a1fb7289f8d29e24f28c4a30089417416f5e885a68d"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.929476 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-z4nmg" event={"ID":"f2c49e53-e8d4-4f9b-a05e-f44516144d43","Type":"ContainerStarted","Data":"82f0e8c7ef7971bb8430e45ccae86762e47ec9642f47378525bdd2b9c257224c"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.929765 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z4nmg"
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.932461 4921 generic.go:334] "Generic (PLEG): container finished" podID="ab9571cc-4c2d-4462-adc5-f84bd590bcca" containerID="4fc05f91e7b0d3dac2ce710337ca89f16cd85f5b23a77148bdfb2f29d1ced007" exitCode=0
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.932502 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab9571cc-4c2d-4462-adc5-f84bd590bcca","Type":"ContainerDied","Data":"4fc05f91e7b0d3dac2ce710337ca89f16cd85f5b23a77148bdfb2f29d1ced007"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.935446 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f49cecf-a341-4a70-b7f7-e2f61c313f0a","Type":"ContainerStarted","Data":"eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4"}
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.935580 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.954715 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.938399054 podStartE2EDuration="24.954696004s" podCreationTimestamp="2026-03-12 13:27:19 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.623150353 +0000 UTC m=+1073.313222324" lastFinishedPulling="2026-03-12 13:27:36.639447283 +0000 UTC m=+1079.329519274" observedRunningTime="2026-03-12 13:27:43.95128574 +0000 UTC m=+1086.641357741" watchObservedRunningTime="2026-03-12 13:27:43.954696004 +0000 UTC m=+1086.644767985"
Mar 12 13:27:43 crc kubenswrapper[4921]: I0312 13:27:43.970265 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.538362492 podStartE2EDuration="20.970242854s" podCreationTimestamp="2026-03-12 13:27:23 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.800328282 +0000 UTC m=+1073.490400253" lastFinishedPulling="2026-03-12 13:27:43.232208644 +0000 UTC m=+1085.922280615" observedRunningTime="2026-03-12 13:27:43.969284104 +0000 UTC m=+1086.659356075" watchObservedRunningTime="2026-03-12 13:27:43.970242854 +0000 UTC m=+1086.660314845"
Mar 12 13:27:44 crc kubenswrapper[4921]: I0312 13:27:44.063693 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-z4nmg" podStartSLOduration=12.743166699 podStartE2EDuration="18.063664432s" podCreationTimestamp="2026-03-12 13:27:26 +0000 UTC" firstStartedPulling="2026-03-12 13:27:32.405331412 +0000 UTC m=+1075.095403383" lastFinishedPulling="2026-03-12 13:27:37.725829145 +0000 UTC m=+1080.415901116" observedRunningTime="2026-03-12 13:27:44.019259203 +0000 UTC m=+1086.709331184" watchObservedRunningTime="2026-03-12 13:27:44.063664432 +0000 UTC m=+1086.753736423"
Mar 12 13:27:44 crc kubenswrapper[4921]: I0312 13:27:44.065724 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.667257393 podStartE2EDuration="16.065713025s" podCreationTimestamp="2026-03-12 13:27:28 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.955506114 +0000 UTC m=+1073.645578085" lastFinishedPulling="2026-03-12 13:27:43.353961746 +0000 UTC m=+1086.044033717" observedRunningTime="2026-03-12 13:27:44.051869589 +0000 UTC m=+1086.741941570" watchObservedRunningTime="2026-03-12 13:27:44.065713025 +0000 UTC m=+1086.755785006"
Mar 12 13:27:44 crc kubenswrapper[4921]: I0312 13:27:44.090192 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.612067992 podStartE2EDuration="18.090146338s" podCreationTimestamp="2026-03-12 13:27:26 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.848543738 +0000 UTC m=+1073.538615709" lastFinishedPulling="2026-03-12 13:27:43.326622084 +0000 UTC m=+1086.016694055" observedRunningTime="2026-03-12 13:27:44.075795215 +0000 UTC m=+1086.765867196" watchObservedRunningTime="2026-03-12 13:27:44.090146338 +0000 UTC m=+1086.780218319"
Mar 12 13:27:44 crc kubenswrapper[4921]: I0312 13:27:44.951678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab9571cc-4c2d-4462-adc5-f84bd590bcca","Type":"ContainerStarted","Data":"058c438f41481e458f8a07669212172c325a644d6601a228b2ded5020576c1a7"}
Mar 12 13:27:44 crc kubenswrapper[4921]: I0312 13:27:44.952964 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-z4nmg"
Mar 12 13:27:44 crc kubenswrapper[4921]: I0312 13:27:44.989217 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.529522886 podStartE2EDuration="26.989179067s" podCreationTimestamp="2026-03-12 13:27:18 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.617956423 +0000 UTC m=+1073.308028404" lastFinishedPulling="2026-03-12 13:27:38.077612614 +0000 UTC m=+1080.767684585" observedRunningTime="2026-03-12 13:27:44.979016704 +0000 UTC m=+1087.669088685" watchObservedRunningTime="2026-03-12 13:27:44.989179067 +0000 UTC m=+1087.679251038"
Mar 12 13:27:45 crc kubenswrapper[4921]: I0312 13:27:45.314670 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 12 13:27:45 crc kubenswrapper[4921]: I0312 13:27:45.315574 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 12 13:27:45 crc kubenswrapper[4921]: I0312 13:27:45.365448 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.017794 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.174652 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.227638 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.335199 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m75rq"]
Mar 12 13:27:46 crc kubenswrapper[4921]: E0312 13:27:46.335765 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" containerName="dnsmasq-dns"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.335792 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" containerName="dnsmasq-dns"
Mar 12 13:27:46 crc kubenswrapper[4921]: E0312 13:27:46.335848 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" containerName="init"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.335855 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" containerName="init"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.336050 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb951944-74c3-4beb-9368-6b67ada02c98" containerName="dnsmasq-dns"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.337184 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.344573 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.359164 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m75rq"]
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.365725 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-zhfgt"]
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.367054 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.379133 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.389335 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zhfgt"]
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454284 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-ovn-rundir\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454343 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454394 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454463 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-config\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454490 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5tq\" (UniqueName: \"kubernetes.io/projected/ee5ae924-0c20-44be-ad57-1aec571386f0-kube-api-access-hr5tq\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454521 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-combined-ca-bundle\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454568 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k52d\" (UniqueName: \"kubernetes.io/projected/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-kube-api-access-9k52d\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454588 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-ovs-rundir\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.454617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-config\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557008 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-config\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557143 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5tq\" (UniqueName: \"kubernetes.io/projected/ee5ae924-0c20-44be-ad57-1aec571386f0-kube-api-access-hr5tq\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557208 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-combined-ca-bundle\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557238 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k52d\" (UniqueName: \"kubernetes.io/projected/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-kube-api-access-9k52d\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557277 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-ovs-rundir\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557332 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-config\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557406 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-ovn-rundir\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557795 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-ovs-rundir\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.557898 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-ovn-rundir\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.558191 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-config\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.558796 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-config\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.559032 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.559159 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.559398 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.559574 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.559901 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.560331 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.567862 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.577499 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k52d\" (UniqueName: \"kubernetes.io/projected/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-kube-api-access-9k52d\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.581835 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5tq\" (UniqueName: \"kubernetes.io/projected/ee5ae924-0c20-44be-ad57-1aec571386f0-kube-api-access-hr5tq\") pod \"dnsmasq-dns-7f896c8c65-m75rq\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.582266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb-combined-ca-bundle\") pod \"ovn-controller-metrics-zhfgt\" (UID: \"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb\") " pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.658637 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.688085 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-zhfgt"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.785382 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m75rq"]
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.836020 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rgqch"]
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.837141 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.840232 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 12 13:27:46 crc kubenswrapper[4921]: I0312 13:27:46.901503 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rgqch"]
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.027151 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.029189 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-config\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.029430 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.029522 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvml\" (UniqueName: \"kubernetes.io/projected/464806fa-ec1f-477a-bd5e-bae85b7eaff3-kube-api-access-tsvml\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.029627 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.030033 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.104912 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.132405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.132452 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-config\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.132631 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.132653 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvml\" (UniqueName: \"kubernetes.io/projected/464806fa-ec1f-477a-bd5e-bae85b7eaff3-kube-api-access-tsvml\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.132706 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.133750 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.134586 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.136141 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-config\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.137324 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.158128 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvml\" (UniqueName: \"kubernetes.io/projected/464806fa-ec1f-477a-bd5e-bae85b7eaff3-kube-api-access-tsvml\") pod \"dnsmasq-dns-86db49b7ff-rgqch\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.160228 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch"
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.347157 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.348797 4921 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.353228 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.353436 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.353616 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.353628 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-sx5cf" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.360178 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.393610 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m75rq"] Mar 12 13:27:47 crc kubenswrapper[4921]: W0312 13:27:47.400347 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5ae924_0c20_44be_ad57_1aec571386f0.slice/crio-1009f6d6a94cfe11d23f439439981d4e037522d4e4fbe1a2c1b88e3e31ac3fe0 WatchSource:0}: Error finding container 1009f6d6a94cfe11d23f439439981d4e037522d4e4fbe1a2c1b88e3e31ac3fe0: Status 404 returned error can't find the container with id 1009f6d6a94cfe11d23f439439981d4e037522d4e4fbe1a2c1b88e3e31ac3fe0 Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.447109 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-scripts\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 
13:27:47.447172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-config\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.447228 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d88vt\" (UniqueName: \"kubernetes.io/projected/ff3ad30a-89e1-4463-b43b-97d8af948926-kube-api-access-d88vt\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.447255 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.447373 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.447418 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.447469 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.477968 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zhfgt"] Mar 12 13:27:47 crc kubenswrapper[4921]: W0312 13:27:47.489751 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b5f8311_11bc_477e_b80a_ed2fa2ebc3bb.slice/crio-65a70baca42f74c9218cec1aa83b6b1d24820296a722103f607519542476ca4c WatchSource:0}: Error finding container 65a70baca42f74c9218cec1aa83b6b1d24820296a722103f607519542476ca4c: Status 404 returned error can't find the container with id 65a70baca42f74c9218cec1aa83b6b1d24820296a722103f607519542476ca4c Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550114 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550161 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-scripts\") pod \"ovn-northd-0\" (UID: 
\"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550193 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-config\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550229 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d88vt\" (UniqueName: \"kubernetes.io/projected/ff3ad30a-89e1-4463-b43b-97d8af948926-kube-api-access-d88vt\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550273 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.550385 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.555536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.557279 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.559112 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-scripts\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.559413 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-config\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.565135 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.567936 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.571740 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d88vt\" (UniqueName: \"kubernetes.io/projected/ff3ad30a-89e1-4463-b43b-97d8af948926-kube-api-access-d88vt\") pod \"ovn-northd-0\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc 
kubenswrapper[4921]: I0312 13:27:47.620949 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rgqch"] Mar 12 13:27:47 crc kubenswrapper[4921]: W0312 13:27:47.623958 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod464806fa_ec1f_477a_bd5e_bae85b7eaff3.slice/crio-b48f1a704572fe236faccd4a47f2b2c53fe3eb6154b3c751d4fd96638b6212f9 WatchSource:0}: Error finding container b48f1a704572fe236faccd4a47f2b2c53fe3eb6154b3c751d4fd96638b6212f9: Status 404 returned error can't find the container with id b48f1a704572fe236faccd4a47f2b2c53fe3eb6154b3c751d4fd96638b6212f9 Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.664407 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 13:27:47 crc kubenswrapper[4921]: I0312 13:27:47.947207 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 13:27:47 crc kubenswrapper[4921]: W0312 13:27:47.953718 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff3ad30a_89e1_4463_b43b_97d8af948926.slice/crio-7817df3f00068386c62bd5cf111d5f2e9b098f29bb543a3f47ef58fe790eb185 WatchSource:0}: Error finding container 7817df3f00068386c62bd5cf111d5f2e9b098f29bb543a3f47ef58fe790eb185: Status 404 returned error can't find the container with id 7817df3f00068386c62bd5cf111d5f2e9b098f29bb543a3f47ef58fe790eb185 Mar 12 13:27:48 crc kubenswrapper[4921]: I0312 13:27:48.041953 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq" event={"ID":"ee5ae924-0c20-44be-ad57-1aec571386f0","Type":"ContainerStarted","Data":"1009f6d6a94cfe11d23f439439981d4e037522d4e4fbe1a2c1b88e3e31ac3fe0"} Mar 12 13:27:48 crc kubenswrapper[4921]: I0312 13:27:48.044587 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-zhfgt" event={"ID":"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb","Type":"ContainerStarted","Data":"65a70baca42f74c9218cec1aa83b6b1d24820296a722103f607519542476ca4c"} Mar 12 13:27:48 crc kubenswrapper[4921]: I0312 13:27:48.048867 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" event={"ID":"464806fa-ec1f-477a-bd5e-bae85b7eaff3","Type":"ContainerStarted","Data":"b48f1a704572fe236faccd4a47f2b2c53fe3eb6154b3c751d4fd96638b6212f9"} Mar 12 13:27:48 crc kubenswrapper[4921]: I0312 13:27:48.052127 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff3ad30a-89e1-4463-b43b-97d8af948926","Type":"ContainerStarted","Data":"7817df3f00068386c62bd5cf111d5f2e9b098f29bb543a3f47ef58fe790eb185"} Mar 12 13:27:49 crc kubenswrapper[4921]: I0312 13:27:49.776766 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 12 13:27:49 crc kubenswrapper[4921]: I0312 13:27:49.777147 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 12 13:27:51 crc kubenswrapper[4921]: I0312 13:27:51.083622 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq" event={"ID":"ee5ae924-0c20-44be-ad57-1aec571386f0","Type":"ContainerStarted","Data":"ac5f63c41102ea94c89978f84cc4d74739f4b6b0fb1d5143891f83b69f49f454"} Mar 12 13:27:51 crc kubenswrapper[4921]: I0312 13:27:51.172346 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:51 crc kubenswrapper[4921]: I0312 13:27:51.172743 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:51 crc kubenswrapper[4921]: I0312 13:27:51.260049 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" 
Mar 12 13:27:52 crc kubenswrapper[4921]: I0312 13:27:52.200840 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 12 13:27:52 crc kubenswrapper[4921]: E0312 13:27:52.823180 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee5ae924_0c20_44be_ad57_1aec571386f0.slice/crio-conmon-ac5f63c41102ea94c89978f84cc4d74739f4b6b0fb1d5143891f83b69f49f454.scope\": RecentStats: unable to find data in memory cache]" Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.107768 4921 generic.go:334] "Generic (PLEG): container finished" podID="ee5ae924-0c20-44be-ad57-1aec571386f0" containerID="ac5f63c41102ea94c89978f84cc4d74739f4b6b0fb1d5143891f83b69f49f454" exitCode=0 Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.107905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq" event={"ID":"ee5ae924-0c20-44be-ad57-1aec571386f0","Type":"ContainerDied","Data":"ac5f63c41102ea94c89978f84cc4d74739f4b6b0fb1d5143891f83b69f49f454"} Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.110114 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zhfgt" event={"ID":"0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb","Type":"ContainerStarted","Data":"c6deec749b38fa5ed57a86644d0b2a2c09fdbf74df5e27e1831c4d8784ad01cb"} Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.112440 4921 generic.go:334] "Generic (PLEG): container finished" podID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerID="8d7f31fcf1ba37287e343ddead93fac5db87fe0a5684cc4b322feb78276403a7" exitCode=0 Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.112550 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" 
event={"ID":"464806fa-ec1f-477a-bd5e-bae85b7eaff3","Type":"ContainerDied","Data":"8d7f31fcf1ba37287e343ddead93fac5db87fe0a5684cc4b322feb78276403a7"} Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.166807 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-zhfgt" podStartSLOduration=7.166772656 podStartE2EDuration="7.166772656s" podCreationTimestamp="2026-03-12 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:27:53.154622581 +0000 UTC m=+1095.844694552" watchObservedRunningTime="2026-03-12 13:27:53.166772656 +0000 UTC m=+1095.856844627" Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.170398 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.307975 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.803467 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 12 13:27:53 crc kubenswrapper[4921]: I0312 13:27:53.917806 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.059640 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-ovsdbserver-sb\") pod \"ee5ae924-0c20-44be-ad57-1aec571386f0\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.059722 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-dns-svc\") pod \"ee5ae924-0c20-44be-ad57-1aec571386f0\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.059764 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-config\") pod \"ee5ae924-0c20-44be-ad57-1aec571386f0\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.059803 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr5tq\" (UniqueName: \"kubernetes.io/projected/ee5ae924-0c20-44be-ad57-1aec571386f0-kube-api-access-hr5tq\") pod \"ee5ae924-0c20-44be-ad57-1aec571386f0\" (UID: \"ee5ae924-0c20-44be-ad57-1aec571386f0\") " Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.063017 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee5ae924-0c20-44be-ad57-1aec571386f0-kube-api-access-hr5tq" (OuterVolumeSpecName: "kube-api-access-hr5tq") pod "ee5ae924-0c20-44be-ad57-1aec571386f0" (UID: "ee5ae924-0c20-44be-ad57-1aec571386f0"). InnerVolumeSpecName "kube-api-access-hr5tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.094480 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee5ae924-0c20-44be-ad57-1aec571386f0" (UID: "ee5ae924-0c20-44be-ad57-1aec571386f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.094739 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-config" (OuterVolumeSpecName: "config") pod "ee5ae924-0c20-44be-ad57-1aec571386f0" (UID: "ee5ae924-0c20-44be-ad57-1aec571386f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.109309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee5ae924-0c20-44be-ad57-1aec571386f0" (UID: "ee5ae924-0c20-44be-ad57-1aec571386f0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.161166 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.161403 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.161509 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee5ae924-0c20-44be-ad57-1aec571386f0-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.161578 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr5tq\" (UniqueName: \"kubernetes.io/projected/ee5ae924-0c20-44be-ad57-1aec571386f0-kube-api-access-hr5tq\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.172253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" event={"ID":"464806fa-ec1f-477a-bd5e-bae85b7eaff3","Type":"ContainerStarted","Data":"8aeae923b81d66363f05f5405c44d0a1f69eeceb60cd2780d03e1d897bb894c8"} Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.172933 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.194379 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff3ad30a-89e1-4463-b43b-97d8af948926","Type":"ContainerStarted","Data":"9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc"} Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.201073 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq" event={"ID":"ee5ae924-0c20-44be-ad57-1aec571386f0","Type":"ContainerDied","Data":"1009f6d6a94cfe11d23f439439981d4e037522d4e4fbe1a2c1b88e3e31ac3fe0"} Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.201149 4921 scope.go:117] "RemoveContainer" containerID="ac5f63c41102ea94c89978f84cc4d74739f4b6b0fb1d5143891f83b69f49f454" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.201979 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m75rq" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.202205 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" podStartSLOduration=8.202189506 podStartE2EDuration="8.202189506s" podCreationTimestamp="2026-03-12 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:27:54.199502144 +0000 UTC m=+1096.889574105" watchObservedRunningTime="2026-03-12 13:27:54.202189506 +0000 UTC m=+1096.892261477" Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.271138 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m75rq"] Mar 12 13:27:54 crc kubenswrapper[4921]: I0312 13:27:54.279668 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m75rq"] Mar 12 13:27:55 crc kubenswrapper[4921]: I0312 13:27:55.210481 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff3ad30a-89e1-4463-b43b-97d8af948926","Type":"ContainerStarted","Data":"13bffa3502335915c744aaf64a05c7953e648d43e83f1a96fa0906a41832bb73"} Mar 12 13:27:55 crc kubenswrapper[4921]: I0312 13:27:55.211674 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 12 13:27:55 crc kubenswrapper[4921]: I0312 
13:27:55.233085 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.420144963 podStartE2EDuration="8.233068008s" podCreationTimestamp="2026-03-12 13:27:47 +0000 UTC" firstStartedPulling="2026-03-12 13:27:47.956435977 +0000 UTC m=+1090.646507948" lastFinishedPulling="2026-03-12 13:27:53.769359022 +0000 UTC m=+1096.459430993" observedRunningTime="2026-03-12 13:27:55.229385584 +0000 UTC m=+1097.919457545" watchObservedRunningTime="2026-03-12 13:27:55.233068008 +0000 UTC m=+1097.923139969" Mar 12 13:27:55 crc kubenswrapper[4921]: I0312 13:27:55.991086 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee5ae924-0c20-44be-ad57-1aec571386f0" path="/var/lib/kubelet/pods/ee5ae924-0c20-44be-ad57-1aec571386f0/volumes" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.864182 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zd7sn"] Mar 12 13:27:56 crc kubenswrapper[4921]: E0312 13:27:56.864956 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee5ae924-0c20-44be-ad57-1aec571386f0" containerName="init" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.864972 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee5ae924-0c20-44be-ad57-1aec571386f0" containerName="init" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.865149 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee5ae924-0c20-44be-ad57-1aec571386f0" containerName="init" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.865698 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.880967 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-527f-account-create-update-j7dk7"] Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.882104 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.883632 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.895517 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zd7sn"] Mar 12 13:27:56 crc kubenswrapper[4921]: I0312 13:27:56.905014 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-527f-account-create-update-j7dk7"] Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.005333 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10be574f-fcea-4cd1-9ce9-7146709cc274-operator-scripts\") pod \"glance-527f-account-create-update-j7dk7\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.005409 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xznz\" (UniqueName: \"kubernetes.io/projected/a7ee872b-3556-4e4b-912a-4124b76e5ccc-kube-api-access-2xznz\") pod \"glance-db-create-zd7sn\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.005460 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ee872b-3556-4e4b-912a-4124b76e5ccc-operator-scripts\") pod \"glance-db-create-zd7sn\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.005496 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9rwhd\" (UniqueName: \"kubernetes.io/projected/10be574f-fcea-4cd1-9ce9-7146709cc274-kube-api-access-9rwhd\") pod \"glance-527f-account-create-update-j7dk7\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.107145 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xznz\" (UniqueName: \"kubernetes.io/projected/a7ee872b-3556-4e4b-912a-4124b76e5ccc-kube-api-access-2xznz\") pod \"glance-db-create-zd7sn\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.107274 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ee872b-3556-4e4b-912a-4124b76e5ccc-operator-scripts\") pod \"glance-db-create-zd7sn\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.107341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwhd\" (UniqueName: \"kubernetes.io/projected/10be574f-fcea-4cd1-9ce9-7146709cc274-kube-api-access-9rwhd\") pod \"glance-527f-account-create-update-j7dk7\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.107497 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10be574f-fcea-4cd1-9ce9-7146709cc274-operator-scripts\") pod \"glance-527f-account-create-update-j7dk7\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.108270 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10be574f-fcea-4cd1-9ce9-7146709cc274-operator-scripts\") pod \"glance-527f-account-create-update-j7dk7\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.108273 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ee872b-3556-4e4b-912a-4124b76e5ccc-operator-scripts\") pod \"glance-db-create-zd7sn\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.128027 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xznz\" (UniqueName: \"kubernetes.io/projected/a7ee872b-3556-4e4b-912a-4124b76e5ccc-kube-api-access-2xznz\") pod \"glance-db-create-zd7sn\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.131238 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwhd\" (UniqueName: \"kubernetes.io/projected/10be574f-fcea-4cd1-9ce9-7146709cc274-kube-api-access-9rwhd\") pod \"glance-527f-account-create-update-j7dk7\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.184345 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.201287 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.700967 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zd7sn"] Mar 12 13:27:57 crc kubenswrapper[4921]: I0312 13:27:57.763450 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-527f-account-create-update-j7dk7"] Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.236754 4921 generic.go:334] "Generic (PLEG): container finished" podID="10be574f-fcea-4cd1-9ce9-7146709cc274" containerID="c5a788ad4a878e35801a780683af5419bc1e72b1a39b63de7f8324361e451cc8" exitCode=0 Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.236865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-527f-account-create-update-j7dk7" event={"ID":"10be574f-fcea-4cd1-9ce9-7146709cc274","Type":"ContainerDied","Data":"c5a788ad4a878e35801a780683af5419bc1e72b1a39b63de7f8324361e451cc8"} Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.236913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-527f-account-create-update-j7dk7" event={"ID":"10be574f-fcea-4cd1-9ce9-7146709cc274","Type":"ContainerStarted","Data":"ee1d6d1c9a71bb599c96c029ad375eb7c610f5b5c48260f22717c4cda9ae7d69"} Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.239033 4921 generic.go:334] "Generic (PLEG): container finished" podID="a7ee872b-3556-4e4b-912a-4124b76e5ccc" containerID="97a9d21a513fc667b427eec317df9c0a344aab77b82686f22a9fe4bcced47675" exitCode=0 Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.239059 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zd7sn" event={"ID":"a7ee872b-3556-4e4b-912a-4124b76e5ccc","Type":"ContainerDied","Data":"97a9d21a513fc667b427eec317df9c0a344aab77b82686f22a9fe4bcced47675"} Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.239072 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-zd7sn" event={"ID":"a7ee872b-3556-4e4b-912a-4124b76e5ccc","Type":"ContainerStarted","Data":"682a8dc0ed04bdbe3189178915bbb4ac4965991406c1adcacb912d0b53293651"} Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.404949 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8sbv9"] Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.406832 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.409447 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.417031 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8sbv9"] Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.534228 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-operator-scripts\") pod \"root-account-create-update-8sbv9\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.534296 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr9bf\" (UniqueName: \"kubernetes.io/projected/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-kube-api-access-lr9bf\") pod \"root-account-create-update-8sbv9\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.636092 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-operator-scripts\") 
pod \"root-account-create-update-8sbv9\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.636528 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr9bf\" (UniqueName: \"kubernetes.io/projected/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-kube-api-access-lr9bf\") pod \"root-account-create-update-8sbv9\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.639662 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-operator-scripts\") pod \"root-account-create-update-8sbv9\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.657016 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr9bf\" (UniqueName: \"kubernetes.io/projected/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-kube-api-access-lr9bf\") pod \"root-account-create-update-8sbv9\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:58 crc kubenswrapper[4921]: I0312 13:27:58.727516 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8sbv9" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.001198 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8sbv9"] Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.261737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sbv9" event={"ID":"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3","Type":"ContainerStarted","Data":"3d0fdc0ecffd9277de528115a9925fd78f5e48b685197844c2b4614363ef8dc0"} Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.262134 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sbv9" event={"ID":"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3","Type":"ContainerStarted","Data":"9fa25ae6f88ef611e8d65ff7759297747110050fce13e46ef9ae19d27b8643a1"} Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.284386 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-8sbv9" podStartSLOduration=1.2843583170000001 podStartE2EDuration="1.284358317s" podCreationTimestamp="2026-03-12 13:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:27:59.277852256 +0000 UTC m=+1101.967924237" watchObservedRunningTime="2026-03-12 13:27:59.284358317 +0000 UTC m=+1101.974430308" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.688276 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.693844 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zd7sn" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.753703 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10be574f-fcea-4cd1-9ce9-7146709cc274-operator-scripts\") pod \"10be574f-fcea-4cd1-9ce9-7146709cc274\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.753859 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rwhd\" (UniqueName: \"kubernetes.io/projected/10be574f-fcea-4cd1-9ce9-7146709cc274-kube-api-access-9rwhd\") pod \"10be574f-fcea-4cd1-9ce9-7146709cc274\" (UID: \"10be574f-fcea-4cd1-9ce9-7146709cc274\") " Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.754578 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10be574f-fcea-4cd1-9ce9-7146709cc274-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10be574f-fcea-4cd1-9ce9-7146709cc274" (UID: "10be574f-fcea-4cd1-9ce9-7146709cc274"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.759524 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10be574f-fcea-4cd1-9ce9-7146709cc274-kube-api-access-9rwhd" (OuterVolumeSpecName: "kube-api-access-9rwhd") pod "10be574f-fcea-4cd1-9ce9-7146709cc274" (UID: "10be574f-fcea-4cd1-9ce9-7146709cc274"). InnerVolumeSpecName "kube-api-access-9rwhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.855456 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xznz\" (UniqueName: \"kubernetes.io/projected/a7ee872b-3556-4e4b-912a-4124b76e5ccc-kube-api-access-2xznz\") pod \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.855977 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ee872b-3556-4e4b-912a-4124b76e5ccc-operator-scripts\") pod \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\" (UID: \"a7ee872b-3556-4e4b-912a-4124b76e5ccc\") " Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.856264 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10be574f-fcea-4cd1-9ce9-7146709cc274-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.856282 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rwhd\" (UniqueName: \"kubernetes.io/projected/10be574f-fcea-4cd1-9ce9-7146709cc274-kube-api-access-9rwhd\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.856557 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7ee872b-3556-4e4b-912a-4124b76e5ccc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7ee872b-3556-4e4b-912a-4124b76e5ccc" (UID: "a7ee872b-3556-4e4b-912a-4124b76e5ccc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.858300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ee872b-3556-4e4b-912a-4124b76e5ccc-kube-api-access-2xznz" (OuterVolumeSpecName: "kube-api-access-2xznz") pod "a7ee872b-3556-4e4b-912a-4124b76e5ccc" (UID: "a7ee872b-3556-4e4b-912a-4124b76e5ccc"). InnerVolumeSpecName "kube-api-access-2xznz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.963016 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7ee872b-3556-4e4b-912a-4124b76e5ccc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:27:59 crc kubenswrapper[4921]: I0312 13:27:59.963060 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xznz\" (UniqueName: \"kubernetes.io/projected/a7ee872b-3556-4e4b-912a-4124b76e5ccc-kube-api-access-2xznz\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.134099 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555368-82pj9"] Mar 12 13:28:00 crc kubenswrapper[4921]: E0312 13:28:00.134613 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ee872b-3556-4e4b-912a-4124b76e5ccc" containerName="mariadb-database-create" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.134641 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ee872b-3556-4e4b-912a-4124b76e5ccc" containerName="mariadb-database-create" Mar 12 13:28:00 crc kubenswrapper[4921]: E0312 13:28:00.134686 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10be574f-fcea-4cd1-9ce9-7146709cc274" containerName="mariadb-account-create-update" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.134698 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10be574f-fcea-4cd1-9ce9-7146709cc274" containerName="mariadb-account-create-update" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.135018 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="10be574f-fcea-4cd1-9ce9-7146709cc274" containerName="mariadb-account-create-update" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.135050 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ee872b-3556-4e4b-912a-4124b76e5ccc" containerName="mariadb-database-create" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.135907 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555368-82pj9" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.139922 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.140243 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.142519 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.143887 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555368-82pj9"] Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.267528 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4d9\" (UniqueName: \"kubernetes.io/projected/982b1b07-1ac6-4431-9226-3bf8129423cd-kube-api-access-nz4d9\") pod \"auto-csr-approver-29555368-82pj9\" (UID: \"982b1b07-1ac6-4431-9226-3bf8129423cd\") " pod="openshift-infra/auto-csr-approver-29555368-82pj9" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.268960 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" containerID="3d0fdc0ecffd9277de528115a9925fd78f5e48b685197844c2b4614363ef8dc0" exitCode=0 Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.269027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sbv9" event={"ID":"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3","Type":"ContainerDied","Data":"3d0fdc0ecffd9277de528115a9925fd78f5e48b685197844c2b4614363ef8dc0"} Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.270921 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-527f-account-create-update-j7dk7" event={"ID":"10be574f-fcea-4cd1-9ce9-7146709cc274","Type":"ContainerDied","Data":"ee1d6d1c9a71bb599c96c029ad375eb7c610f5b5c48260f22717c4cda9ae7d69"} Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.270938 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-527f-account-create-update-j7dk7" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.270951 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee1d6d1c9a71bb599c96c029ad375eb7c610f5b5c48260f22717c4cda9ae7d69" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.272996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zd7sn" event={"ID":"a7ee872b-3556-4e4b-912a-4124b76e5ccc","Type":"ContainerDied","Data":"682a8dc0ed04bdbe3189178915bbb4ac4965991406c1adcacb912d0b53293651"} Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.273026 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="682a8dc0ed04bdbe3189178915bbb4ac4965991406c1adcacb912d0b53293651" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.273041 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zd7sn" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.369417 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4d9\" (UniqueName: \"kubernetes.io/projected/982b1b07-1ac6-4431-9226-3bf8129423cd-kube-api-access-nz4d9\") pod \"auto-csr-approver-29555368-82pj9\" (UID: \"982b1b07-1ac6-4431-9226-3bf8129423cd\") " pod="openshift-infra/auto-csr-approver-29555368-82pj9" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.388193 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4d9\" (UniqueName: \"kubernetes.io/projected/982b1b07-1ac6-4431-9226-3bf8129423cd-kube-api-access-nz4d9\") pod \"auto-csr-approver-29555368-82pj9\" (UID: \"982b1b07-1ac6-4431-9226-3bf8129423cd\") " pod="openshift-infra/auto-csr-approver-29555368-82pj9" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.463688 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555368-82pj9" Mar 12 13:28:00 crc kubenswrapper[4921]: I0312 13:28:00.910283 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555368-82pj9"] Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.283066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555368-82pj9" event={"ID":"982b1b07-1ac6-4431-9226-3bf8129423cd","Type":"ContainerStarted","Data":"22cae3dcffba2eef61d0d6534403166db363586bdece2161ca80f555af7d8d9c"} Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.687841 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8sbv9" Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.806314 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr9bf\" (UniqueName: \"kubernetes.io/projected/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-kube-api-access-lr9bf\") pod \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.806367 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-operator-scripts\") pod \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\" (UID: \"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3\") " Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.807370 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" (UID: "5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.811548 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-kube-api-access-lr9bf" (OuterVolumeSpecName: "kube-api-access-lr9bf") pod "5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" (UID: "5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3"). InnerVolumeSpecName "kube-api-access-lr9bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.908581 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr9bf\" (UniqueName: \"kubernetes.io/projected/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-kube-api-access-lr9bf\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:01 crc kubenswrapper[4921]: I0312 13:28:01.908620 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.051639 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hnqjm"] Mar 12 13:28:02 crc kubenswrapper[4921]: E0312 13:28:02.052079 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" containerName="mariadb-account-create-update" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.052103 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" containerName="mariadb-account-create-update" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.052332 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" containerName="mariadb-account-create-update" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.053005 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.055140 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.055387 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rpw6q" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.067996 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hnqjm"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.161971 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.213737 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ncsx9"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.214034 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="dnsmasq-dns" containerID="cri-o://e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a" gracePeriod=10 Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.214677 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-config-data\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.214860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-db-sync-config-data\") pod \"glance-db-sync-hnqjm\" (UID: 
\"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.215057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-275cv\" (UniqueName: \"kubernetes.io/projected/7c4cb6d3-4372-4daa-bebb-49c822b98228-kube-api-access-275cv\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.215142 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-combined-ca-bundle\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.290783 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sbv9" event={"ID":"5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3","Type":"ContainerDied","Data":"9fa25ae6f88ef611e8d65ff7759297747110050fce13e46ef9ae19d27b8643a1"} Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.290895 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa25ae6f88ef611e8d65ff7759297747110050fce13e46ef9ae19d27b8643a1" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.290847 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8sbv9" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.316614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-275cv\" (UniqueName: \"kubernetes.io/projected/7c4cb6d3-4372-4daa-bebb-49c822b98228-kube-api-access-275cv\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.316672 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-combined-ca-bundle\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.317052 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-config-data\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.317453 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-db-sync-config-data\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.322380 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-combined-ca-bundle\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 
13:28:02.322396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-config-data\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.322396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-db-sync-config-data\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.346759 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-275cv\" (UniqueName: \"kubernetes.io/projected/7c4cb6d3-4372-4daa-bebb-49c822b98228-kube-api-access-275cv\") pod \"glance-db-sync-hnqjm\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.369949 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.719475 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bb3b-account-create-update-bvs2n"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.743367 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.745300 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.746096 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.765547 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-c2x57"] Mar 12 13:28:02 crc kubenswrapper[4921]: E0312 13:28:02.766562 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="dnsmasq-dns" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.766738 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="dnsmasq-dns" Mar 12 13:28:02 crc kubenswrapper[4921]: E0312 13:28:02.767522 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="init" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.767650 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="init" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.768506 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="dnsmasq-dns" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.769509 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.784914 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bb3b-account-create-update-bvs2n"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.819874 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c2x57"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.845431 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-dns-svc\") pod \"0347f6f0-0cbe-4543-8f1f-939b159b8652\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.845597 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d79pd\" (UniqueName: \"kubernetes.io/projected/0347f6f0-0cbe-4543-8f1f-939b159b8652-kube-api-access-d79pd\") pod \"0347f6f0-0cbe-4543-8f1f-939b159b8652\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.845673 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-config\") pod \"0347f6f0-0cbe-4543-8f1f-939b159b8652\" (UID: \"0347f6f0-0cbe-4543-8f1f-939b159b8652\") " Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.846215 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64049769-9653-4351-b97a-881100721f77-operator-scripts\") pod \"keystone-bb3b-account-create-update-bvs2n\" (UID: \"64049769-9653-4351-b97a-881100721f77\") " pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.846333 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ngxb\" (UniqueName: \"kubernetes.io/projected/64049769-9653-4351-b97a-881100721f77-kube-api-access-6ngxb\") pod \"keystone-bb3b-account-create-update-bvs2n\" (UID: \"64049769-9653-4351-b97a-881100721f77\") " pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.865098 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0347f6f0-0cbe-4543-8f1f-939b159b8652-kube-api-access-d79pd" (OuterVolumeSpecName: "kube-api-access-d79pd") pod "0347f6f0-0cbe-4543-8f1f-939b159b8652" (UID: "0347f6f0-0cbe-4543-8f1f-939b159b8652"). InnerVolumeSpecName "kube-api-access-d79pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.920003 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-frghj"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.921346 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-frghj" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.930493 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-frghj"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.933746 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0347f6f0-0cbe-4543-8f1f-939b159b8652" (UID: "0347f6f0-0cbe-4543-8f1f-939b159b8652"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.939465 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-0711-account-create-update-8gxd6"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.942104 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947483 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64049769-9653-4351-b97a-881100721f77-operator-scripts\") pod \"keystone-bb3b-account-create-update-bvs2n\" (UID: \"64049769-9653-4351-b97a-881100721f77\") " pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947550 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ngxb\" (UniqueName: \"kubernetes.io/projected/64049769-9653-4351-b97a-881100721f77-kube-api-access-6ngxb\") pod \"keystone-bb3b-account-create-update-bvs2n\" (UID: \"64049769-9653-4351-b97a-881100721f77\") " pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947576 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7br\" (UniqueName: \"kubernetes.io/projected/6f8352a3-a443-4b74-b6a3-57b2074f7cef-kube-api-access-9t7br\") pod \"keystone-db-create-c2x57\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") " pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f8352a3-a443-4b74-b6a3-57b2074f7cef-operator-scripts\") pod 
\"keystone-db-create-c2x57\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") " pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947719 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0711-account-create-update-8gxd6"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947726 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947770 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.947781 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d79pd\" (UniqueName: \"kubernetes.io/projected/0347f6f0-0cbe-4543-8f1f-939b159b8652-kube-api-access-d79pd\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.948583 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64049769-9653-4351-b97a-881100721f77-operator-scripts\") pod \"keystone-bb3b-account-create-update-bvs2n\" (UID: \"64049769-9653-4351-b97a-881100721f77\") " pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.964042 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ngxb\" (UniqueName: \"kubernetes.io/projected/64049769-9653-4351-b97a-881100721f77-kube-api-access-6ngxb\") pod \"keystone-bb3b-account-create-update-bvs2n\" (UID: \"64049769-9653-4351-b97a-881100721f77\") " pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.971807 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-config" (OuterVolumeSpecName: "config") pod "0347f6f0-0cbe-4543-8f1f-939b159b8652" (UID: "0347f6f0-0cbe-4543-8f1f-939b159b8652"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.979696 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hnqjm"] Mar 12 13:28:02 crc kubenswrapper[4921]: I0312 13:28:02.986627 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.052554 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f8352a3-a443-4b74-b6a3-57b2074f7cef-operator-scripts\") pod \"keystone-db-create-c2x57\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") " pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053247 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-operator-scripts\") pod \"placement-db-create-frghj\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") " pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053416 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrcmn\" (UniqueName: \"kubernetes.io/projected/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-kube-api-access-wrcmn\") pod \"placement-db-create-frghj\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") " pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053469 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/4c025e13-55c5-4aab-8815-d7ab022219b7-operator-scripts\") pod \"placement-0711-account-create-update-8gxd6\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053503 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cpb5\" (UniqueName: \"kubernetes.io/projected/4c025e13-55c5-4aab-8815-d7ab022219b7-kube-api-access-2cpb5\") pod \"placement-0711-account-create-update-8gxd6\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053567 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t7br\" (UniqueName: \"kubernetes.io/projected/6f8352a3-a443-4b74-b6a3-57b2074f7cef-kube-api-access-9t7br\") pod \"keystone-db-create-c2x57\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") " pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053674 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0347f6f0-0cbe-4543-8f1f-939b159b8652-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.053938 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f8352a3-a443-4b74-b6a3-57b2074f7cef-operator-scripts\") pod \"keystone-db-create-c2x57\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") " pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.059196 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.071801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t7br\" (UniqueName: \"kubernetes.io/projected/6f8352a3-a443-4b74-b6a3-57b2074f7cef-kube-api-access-9t7br\") pod \"keystone-db-create-c2x57\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") " pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.103737 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.154860 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-operator-scripts\") pod \"placement-db-create-frghj\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") " pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.155219 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrcmn\" (UniqueName: \"kubernetes.io/projected/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-kube-api-access-wrcmn\") pod \"placement-db-create-frghj\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") " pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.155259 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c025e13-55c5-4aab-8815-d7ab022219b7-operator-scripts\") pod \"placement-0711-account-create-update-8gxd6\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.155285 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2cpb5\" (UniqueName: \"kubernetes.io/projected/4c025e13-55c5-4aab-8815-d7ab022219b7-kube-api-access-2cpb5\") pod \"placement-0711-account-create-update-8gxd6\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.155978 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-operator-scripts\") pod \"placement-db-create-frghj\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") " pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.156731 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c025e13-55c5-4aab-8815-d7ab022219b7-operator-scripts\") pod \"placement-0711-account-create-update-8gxd6\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.172538 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrcmn\" (UniqueName: \"kubernetes.io/projected/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-kube-api-access-wrcmn\") pod \"placement-db-create-frghj\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") " pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.172940 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cpb5\" (UniqueName: \"kubernetes.io/projected/4c025e13-55c5-4aab-8815-d7ab022219b7-kube-api-access-2cpb5\") pod \"placement-0711-account-create-update-8gxd6\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.259069 4921 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/placement-db-create-frghj" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.271594 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.315163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555368-82pj9" event={"ID":"982b1b07-1ac6-4431-9226-3bf8129423cd","Type":"ContainerStarted","Data":"4a9bbae02a3363ca87334d69e34ab8c25f1a8f8b6ffc28003e05588479008ef7"} Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.317832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hnqjm" event={"ID":"7c4cb6d3-4372-4daa-bebb-49c822b98228","Type":"ContainerStarted","Data":"36709a55cafead3c857f7d5410abed283e2ee903ca2b1c1aa5a799fb1b001e30"} Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.324332 4921 generic.go:334] "Generic (PLEG): container finished" podID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerID="e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a" exitCode=0 Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.324411 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" event={"ID":"0347f6f0-0cbe-4543-8f1f-939b159b8652","Type":"ContainerDied","Data":"e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a"} Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.324495 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" event={"ID":"0347f6f0-0cbe-4543-8f1f-939b159b8652","Type":"ContainerDied","Data":"f506dfe7cc455027ab0c0fe7cf1467150cffdba0e0494eba0e32ba610207daf0"} Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.324549 4921 scope.go:117] "RemoveContainer" containerID="e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a" Mar 12 13:28:03 crc 
kubenswrapper[4921]: I0312 13:28:03.324443 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.332767 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555368-82pj9" podStartSLOduration=1.6639898020000001 podStartE2EDuration="3.332750527s" podCreationTimestamp="2026-03-12 13:28:00 +0000 UTC" firstStartedPulling="2026-03-12 13:28:00.920348491 +0000 UTC m=+1103.610420462" lastFinishedPulling="2026-03-12 13:28:02.589109226 +0000 UTC m=+1105.279181187" observedRunningTime="2026-03-12 13:28:03.327355191 +0000 UTC m=+1106.017427162" watchObservedRunningTime="2026-03-12 13:28:03.332750527 +0000 UTC m=+1106.022822498" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.366034 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ncsx9"] Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.367803 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ncsx9"] Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.369008 4921 scope.go:117] "RemoveContainer" containerID="af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.416504 4921 scope.go:117] "RemoveContainer" containerID="e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a" Mar 12 13:28:03 crc kubenswrapper[4921]: E0312 13:28:03.417229 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a\": container with ID starting with e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a not found: ID does not exist" containerID="e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a" Mar 12 13:28:03 crc 
kubenswrapper[4921]: I0312 13:28:03.417283 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a"} err="failed to get container status \"e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a\": rpc error: code = NotFound desc = could not find container \"e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a\": container with ID starting with e12f25f79ae1e784f729f694004415e9f80ee2feef5a89994474aad49d36ce1a not found: ID does not exist" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.417314 4921 scope.go:117] "RemoveContainer" containerID="af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d" Mar 12 13:28:03 crc kubenswrapper[4921]: E0312 13:28:03.417557 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d\": container with ID starting with af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d not found: ID does not exist" containerID="af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.417577 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d"} err="failed to get container status \"af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d\": rpc error: code = NotFound desc = could not find container \"af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d\": container with ID starting with af5ca2083ab37de437b2b7fdaf9265ad25cf5658eeec0ed53eba53bbef36477d not found: ID does not exist" Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.537638 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bb3b-account-create-update-bvs2n"] Mar 12 
13:28:03 crc kubenswrapper[4921]: W0312 13:28:03.539372 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64049769_9653_4351_b97a_881100721f77.slice/crio-d8da20b32dd9e21ddb5d9e6f492fa1af52ecfbd6b40a08f1541557e7e9a9fc0e WatchSource:0}: Error finding container d8da20b32dd9e21ddb5d9e6f492fa1af52ecfbd6b40a08f1541557e7e9a9fc0e: Status 404 returned error can't find the container with id d8da20b32dd9e21ddb5d9e6f492fa1af52ecfbd6b40a08f1541557e7e9a9fc0e Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.663838 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c2x57"] Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.801331 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-frghj"] Mar 12 13:28:03 crc kubenswrapper[4921]: W0312 13:28:03.807633 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c025e13_55c5_4aab_8815_d7ab022219b7.slice/crio-24950c6ccb00f6d48d5f3e3d072104e57980ec8e4656c8d721f3947d8dceb0e4 WatchSource:0}: Error finding container 24950c6ccb00f6d48d5f3e3d072104e57980ec8e4656c8d721f3947d8dceb0e4: Status 404 returned error can't find the container with id 24950c6ccb00f6d48d5f3e3d072104e57980ec8e4656c8d721f3947d8dceb0e4 Mar 12 13:28:03 crc kubenswrapper[4921]: W0312 13:28:03.814090 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa3bfb0c_259e_44db_a8ca_4ea73bd4493d.slice/crio-fe564674942af74cf2c5f4853e05de0edf4b427b05afefb2516ca8f494de97fc WatchSource:0}: Error finding container fe564674942af74cf2c5f4853e05de0edf4b427b05afefb2516ca8f494de97fc: Status 404 returned error can't find the container with id fe564674942af74cf2c5f4853e05de0edf4b427b05afefb2516ca8f494de97fc Mar 12 13:28:03 crc kubenswrapper[4921]: I0312 13:28:03.817373 
4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0711-account-create-update-8gxd6"] Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.003202 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" path="/var/lib/kubelet/pods/0347f6f0-0cbe-4543-8f1f-939b159b8652/volumes" Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.337383 4921 generic.go:334] "Generic (PLEG): container finished" podID="982b1b07-1ac6-4431-9226-3bf8129423cd" containerID="4a9bbae02a3363ca87334d69e34ab8c25f1a8f8b6ffc28003e05588479008ef7" exitCode=0 Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.337614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555368-82pj9" event={"ID":"982b1b07-1ac6-4431-9226-3bf8129423cd","Type":"ContainerDied","Data":"4a9bbae02a3363ca87334d69e34ab8c25f1a8f8b6ffc28003e05588479008ef7"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.341473 4921 generic.go:334] "Generic (PLEG): container finished" podID="4c025e13-55c5-4aab-8815-d7ab022219b7" containerID="8e321f1a10378780b629912a343c698799d64170b4292d09a4f72f1bada3bf74" exitCode=0 Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.341545 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0711-account-create-update-8gxd6" event={"ID":"4c025e13-55c5-4aab-8815-d7ab022219b7","Type":"ContainerDied","Data":"8e321f1a10378780b629912a343c698799d64170b4292d09a4f72f1bada3bf74"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.341573 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0711-account-create-update-8gxd6" event={"ID":"4c025e13-55c5-4aab-8815-d7ab022219b7","Type":"ContainerStarted","Data":"24950c6ccb00f6d48d5f3e3d072104e57980ec8e4656c8d721f3947d8dceb0e4"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.343263 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="6f8352a3-a443-4b74-b6a3-57b2074f7cef" containerID="d03755685a95e15c9d405cf6b91bf49be24aff7ef896032795f7e72c3475ba95" exitCode=0 Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.343315 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c2x57" event={"ID":"6f8352a3-a443-4b74-b6a3-57b2074f7cef","Type":"ContainerDied","Data":"d03755685a95e15c9d405cf6b91bf49be24aff7ef896032795f7e72c3475ba95"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.343367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c2x57" event={"ID":"6f8352a3-a443-4b74-b6a3-57b2074f7cef","Type":"ContainerStarted","Data":"e5aaffb1e3faf4e286db8f6651e443967cd6f623fcaa13f1df0b90a44e89854f"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.345178 4921 generic.go:334] "Generic (PLEG): container finished" podID="64049769-9653-4351-b97a-881100721f77" containerID="bc8fdd860ab4f86d44fc1e13fab924a2c27b7cbae950ef922b8e9f891f6f72b4" exitCode=0 Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.345231 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb3b-account-create-update-bvs2n" event={"ID":"64049769-9653-4351-b97a-881100721f77","Type":"ContainerDied","Data":"bc8fdd860ab4f86d44fc1e13fab924a2c27b7cbae950ef922b8e9f891f6f72b4"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.345250 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb3b-account-create-update-bvs2n" event={"ID":"64049769-9653-4351-b97a-881100721f77","Type":"ContainerStarted","Data":"d8da20b32dd9e21ddb5d9e6f492fa1af52ecfbd6b40a08f1541557e7e9a9fc0e"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.347297 4921 generic.go:334] "Generic (PLEG): container finished" podID="aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" containerID="5b283a65f5056f6ae601c22ed7b82889c24f2d6afaa8210f42d6439b250c45ef" exitCode=0 Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.347980 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-frghj" event={"ID":"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d","Type":"ContainerDied","Data":"5b283a65f5056f6ae601c22ed7b82889c24f2d6afaa8210f42d6439b250c45ef"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.348012 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-frghj" event={"ID":"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d","Type":"ContainerStarted","Data":"fe564674942af74cf2c5f4853e05de0edf4b427b05afefb2516ca8f494de97fc"} Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.826336 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8sbv9"] Mar 12 13:28:04 crc kubenswrapper[4921]: I0312 13:28:04.834126 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8sbv9"] Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.756162 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0711-account-create-update-8gxd6" Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.910887 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c2x57" Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.912613 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cpb5\" (UniqueName: \"kubernetes.io/projected/4c025e13-55c5-4aab-8815-d7ab022219b7-kube-api-access-2cpb5\") pod \"4c025e13-55c5-4aab-8815-d7ab022219b7\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.912800 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c025e13-55c5-4aab-8815-d7ab022219b7-operator-scripts\") pod \"4c025e13-55c5-4aab-8815-d7ab022219b7\" (UID: \"4c025e13-55c5-4aab-8815-d7ab022219b7\") " Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.913521 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c025e13-55c5-4aab-8815-d7ab022219b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c025e13-55c5-4aab-8815-d7ab022219b7" (UID: "4c025e13-55c5-4aab-8815-d7ab022219b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.918621 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bb3b-account-create-update-bvs2n" Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.919302 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c025e13-55c5-4aab-8815-d7ab022219b7-kube-api-access-2cpb5" (OuterVolumeSpecName: "kube-api-access-2cpb5") pod "4c025e13-55c5-4aab-8815-d7ab022219b7" (UID: "4c025e13-55c5-4aab-8815-d7ab022219b7"). InnerVolumeSpecName "kube-api-access-2cpb5". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.924190 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-frghj"
Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.960495 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555368-82pj9"
Mar 12 13:28:05 crc kubenswrapper[4921]: I0312 13:28:05.996775 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3" path="/var/lib/kubelet/pods/5b3c6d11-cc56-40e4-bde4-8bd89e69f7d3/volumes"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014105 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f8352a3-a443-4b74-b6a3-57b2074f7cef-operator-scripts\") pod \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014243 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4d9\" (UniqueName: \"kubernetes.io/projected/982b1b07-1ac6-4431-9226-3bf8129423cd-kube-api-access-nz4d9\") pod \"982b1b07-1ac6-4431-9226-3bf8129423cd\" (UID: \"982b1b07-1ac6-4431-9226-3bf8129423cd\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014287 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ngxb\" (UniqueName: \"kubernetes.io/projected/64049769-9653-4351-b97a-881100721f77-kube-api-access-6ngxb\") pod \"64049769-9653-4351-b97a-881100721f77\" (UID: \"64049769-9653-4351-b97a-881100721f77\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014333 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-operator-scripts\") pod \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014356 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64049769-9653-4351-b97a-881100721f77-operator-scripts\") pod \"64049769-9653-4351-b97a-881100721f77\" (UID: \"64049769-9653-4351-b97a-881100721f77\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014382 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrcmn\" (UniqueName: \"kubernetes.io/projected/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-kube-api-access-wrcmn\") pod \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\" (UID: \"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014418 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t7br\" (UniqueName: \"kubernetes.io/projected/6f8352a3-a443-4b74-b6a3-57b2074f7cef-kube-api-access-9t7br\") pod \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\" (UID: \"6f8352a3-a443-4b74-b6a3-57b2074f7cef\") "
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014673 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f8352a3-a443-4b74-b6a3-57b2074f7cef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f8352a3-a443-4b74-b6a3-57b2074f7cef" (UID: "6f8352a3-a443-4b74-b6a3-57b2074f7cef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014859 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c025e13-55c5-4aab-8815-d7ab022219b7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014884 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cpb5\" (UniqueName: \"kubernetes.io/projected/4c025e13-55c5-4aab-8815-d7ab022219b7-kube-api-access-2cpb5\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.014902 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f8352a3-a443-4b74-b6a3-57b2074f7cef-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.016111 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" (UID: "aa3bfb0c-259e-44db-a8ca-4ea73bd4493d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.016384 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64049769-9653-4351-b97a-881100721f77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64049769-9653-4351-b97a-881100721f77" (UID: "64049769-9653-4351-b97a-881100721f77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.017938 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-kube-api-access-wrcmn" (OuterVolumeSpecName: "kube-api-access-wrcmn") pod "aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" (UID: "aa3bfb0c-259e-44db-a8ca-4ea73bd4493d"). InnerVolumeSpecName "kube-api-access-wrcmn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.017967 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64049769-9653-4351-b97a-881100721f77-kube-api-access-6ngxb" (OuterVolumeSpecName: "kube-api-access-6ngxb") pod "64049769-9653-4351-b97a-881100721f77" (UID: "64049769-9653-4351-b97a-881100721f77"). InnerVolumeSpecName "kube-api-access-6ngxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.019173 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f8352a3-a443-4b74-b6a3-57b2074f7cef-kube-api-access-9t7br" (OuterVolumeSpecName: "kube-api-access-9t7br") pod "6f8352a3-a443-4b74-b6a3-57b2074f7cef" (UID: "6f8352a3-a443-4b74-b6a3-57b2074f7cef"). InnerVolumeSpecName "kube-api-access-9t7br". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.021415 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982b1b07-1ac6-4431-9226-3bf8129423cd-kube-api-access-nz4d9" (OuterVolumeSpecName: "kube-api-access-nz4d9") pod "982b1b07-1ac6-4431-9226-3bf8129423cd" (UID: "982b1b07-1ac6-4431-9226-3bf8129423cd"). InnerVolumeSpecName "kube-api-access-nz4d9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.116837 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz4d9\" (UniqueName: \"kubernetes.io/projected/982b1b07-1ac6-4431-9226-3bf8129423cd-kube-api-access-nz4d9\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.116874 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ngxb\" (UniqueName: \"kubernetes.io/projected/64049769-9653-4351-b97a-881100721f77-kube-api-access-6ngxb\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.116886 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.116896 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64049769-9653-4351-b97a-881100721f77-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.116906 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrcmn\" (UniqueName: \"kubernetes.io/projected/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d-kube-api-access-wrcmn\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.116916 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t7br\" (UniqueName: \"kubernetes.io/projected/6f8352a3-a443-4b74-b6a3-57b2074f7cef-kube-api-access-9t7br\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.380658 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0711-account-create-update-8gxd6"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.380665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0711-account-create-update-8gxd6" event={"ID":"4c025e13-55c5-4aab-8815-d7ab022219b7","Type":"ContainerDied","Data":"24950c6ccb00f6d48d5f3e3d072104e57980ec8e4656c8d721f3947d8dceb0e4"}
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.381241 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24950c6ccb00f6d48d5f3e3d072104e57980ec8e4656c8d721f3947d8dceb0e4"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.386483 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c2x57" event={"ID":"6f8352a3-a443-4b74-b6a3-57b2074f7cef","Type":"ContainerDied","Data":"e5aaffb1e3faf4e286db8f6651e443967cd6f623fcaa13f1df0b90a44e89854f"}
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.386536 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5aaffb1e3faf4e286db8f6651e443967cd6f623fcaa13f1df0b90a44e89854f"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.386496 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c2x57"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.389263 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bb3b-account-create-update-bvs2n" event={"ID":"64049769-9653-4351-b97a-881100721f77","Type":"ContainerDied","Data":"d8da20b32dd9e21ddb5d9e6f492fa1af52ecfbd6b40a08f1541557e7e9a9fc0e"}
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.389317 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8da20b32dd9e21ddb5d9e6f492fa1af52ecfbd6b40a08f1541557e7e9a9fc0e"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.389395 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bb3b-account-create-update-bvs2n"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.393863 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-frghj"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.393853 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-frghj" event={"ID":"aa3bfb0c-259e-44db-a8ca-4ea73bd4493d","Type":"ContainerDied","Data":"fe564674942af74cf2c5f4853e05de0edf4b427b05afefb2516ca8f494de97fc"}
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.394606 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe564674942af74cf2c5f4853e05de0edf4b427b05afefb2516ca8f494de97fc"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.401613 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555362-df84w"]
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.401703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555368-82pj9" event={"ID":"982b1b07-1ac6-4431-9226-3bf8129423cd","Type":"ContainerDied","Data":"22cae3dcffba2eef61d0d6534403166db363586bdece2161ca80f555af7d8d9c"}
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.401745 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555368-82pj9"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.401767 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22cae3dcffba2eef61d0d6534403166db363586bdece2161ca80f555af7d8d9c"
Mar 12 13:28:06 crc kubenswrapper[4921]: I0312 13:28:06.429491 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555362-df84w"]
Mar 12 13:28:07 crc kubenswrapper[4921]: I0312 13:28:07.601286 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-ncsx9" podUID="0347f6f0-0cbe-4543-8f1f-939b159b8652" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: i/o timeout"
Mar 12 13:28:07 crc kubenswrapper[4921]: I0312 13:28:07.757306 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 12 13:28:07 crc kubenswrapper[4921]: I0312 13:28:07.996206 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e979e7-b235-4702-b7b6-303d881df7bb" path="/var/lib/kubelet/pods/90e979e7-b235-4702-b7b6-303d881df7bb/volumes"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.822543 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bd2gn"]
Mar 12 13:28:09 crc kubenswrapper[4921]: E0312 13:28:09.823265 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" containerName="mariadb-database-create"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823283 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" containerName="mariadb-database-create"
Mar 12 13:28:09 crc kubenswrapper[4921]: E0312 13:28:09.823310 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f8352a3-a443-4b74-b6a3-57b2074f7cef" containerName="mariadb-database-create"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823317 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f8352a3-a443-4b74-b6a3-57b2074f7cef" containerName="mariadb-database-create"
Mar 12 13:28:09 crc kubenswrapper[4921]: E0312 13:28:09.823350 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64049769-9653-4351-b97a-881100721f77" containerName="mariadb-account-create-update"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823360 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="64049769-9653-4351-b97a-881100721f77" containerName="mariadb-account-create-update"
Mar 12 13:28:09 crc kubenswrapper[4921]: E0312 13:28:09.823376 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982b1b07-1ac6-4431-9226-3bf8129423cd" containerName="oc"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823385 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="982b1b07-1ac6-4431-9226-3bf8129423cd" containerName="oc"
Mar 12 13:28:09 crc kubenswrapper[4921]: E0312 13:28:09.823393 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c025e13-55c5-4aab-8815-d7ab022219b7" containerName="mariadb-account-create-update"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823402 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c025e13-55c5-4aab-8815-d7ab022219b7" containerName="mariadb-account-create-update"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823562 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" containerName="mariadb-database-create"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823581 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f8352a3-a443-4b74-b6a3-57b2074f7cef" containerName="mariadb-database-create"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823594 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c025e13-55c5-4aab-8815-d7ab022219b7" containerName="mariadb-account-create-update"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823601 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="64049769-9653-4351-b97a-881100721f77" containerName="mariadb-account-create-update"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.823613 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="982b1b07-1ac6-4431-9226-3bf8129423cd" containerName="oc"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.824149 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.830544 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.845464 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bd2gn"]
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.884769 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7njh4\" (UniqueName: \"kubernetes.io/projected/97c3d1a4-1ecf-4c47-86e2-068336153e40-kube-api-access-7njh4\") pod \"root-account-create-update-bd2gn\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") " pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.884868 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97c3d1a4-1ecf-4c47-86e2-068336153e40-operator-scripts\") pod \"root-account-create-update-bd2gn\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") " pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.986955 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7njh4\" (UniqueName: \"kubernetes.io/projected/97c3d1a4-1ecf-4c47-86e2-068336153e40-kube-api-access-7njh4\") pod \"root-account-create-update-bd2gn\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") " pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.987011 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97c3d1a4-1ecf-4c47-86e2-068336153e40-operator-scripts\") pod \"root-account-create-update-bd2gn\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") " pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:09 crc kubenswrapper[4921]: I0312 13:28:09.987732 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97c3d1a4-1ecf-4c47-86e2-068336153e40-operator-scripts\") pod \"root-account-create-update-bd2gn\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") " pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:10 crc kubenswrapper[4921]: I0312 13:28:10.008696 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7njh4\" (UniqueName: \"kubernetes.io/projected/97c3d1a4-1ecf-4c47-86e2-068336153e40-kube-api-access-7njh4\") pod \"root-account-create-update-bd2gn\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") " pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:10 crc kubenswrapper[4921]: I0312 13:28:10.142825 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:12 crc kubenswrapper[4921]: I0312 13:28:12.339621 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z4nmg"
Mar 12 13:28:12 crc kubenswrapper[4921]: I0312 13:28:12.342463 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s4mtb" podUID="6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb" containerName="ovn-controller" probeResult="failure" output=<
Mar 12 13:28:12 crc kubenswrapper[4921]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 12 13:28:12 crc kubenswrapper[4921]: >
Mar 12 13:28:12 crc kubenswrapper[4921]: I0312 13:28:12.474298 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerID="84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd" exitCode=0
Mar 12 13:28:12 crc kubenswrapper[4921]: I0312 13:28:12.474399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf4146bb-5512-4a8d-81a6-b462a508be2f","Type":"ContainerDied","Data":"84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd"}
Mar 12 13:28:12 crc kubenswrapper[4921]: I0312 13:28:12.488212 4921 generic.go:334] "Generic (PLEG): container finished" podID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerID="cfab788e9f5ce4b3ba25a82075975900efb79d705a9bb5a1bdfdd4a9183dccb7" exitCode=0
Mar 12 13:28:12 crc kubenswrapper[4921]: I0312 13:28:12.488310 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c83f4404-c7af-4fb6-aa92-6ac4e691a27f","Type":"ContainerDied","Data":"cfab788e9f5ce4b3ba25a82075975900efb79d705a9bb5a1bdfdd4a9183dccb7"}
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.206408 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bd2gn"]
Mar 12 13:28:15 crc kubenswrapper[4921]: W0312 13:28:15.237375 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97c3d1a4_1ecf_4c47_86e2_068336153e40.slice/crio-f6454927a71e9ce7e97e4fe3621a53db78fa610a0548aba26634f6a7ce0235c3 WatchSource:0}: Error finding container f6454927a71e9ce7e97e4fe3621a53db78fa610a0548aba26634f6a7ce0235c3: Status 404 returned error can't find the container with id f6454927a71e9ce7e97e4fe3621a53db78fa610a0548aba26634f6a7ce0235c3
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.511118 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hnqjm" event={"ID":"7c4cb6d3-4372-4daa-bebb-49c822b98228","Type":"ContainerStarted","Data":"f3c22bce66ad9c60d84382052c554ec84ad5fedfddff0fb93f94d516208b2105"}
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.513781 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf4146bb-5512-4a8d-81a6-b462a508be2f","Type":"ContainerStarted","Data":"35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d"}
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.514036 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.516374 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c83f4404-c7af-4fb6-aa92-6ac4e691a27f","Type":"ContainerStarted","Data":"2622f1967762c5f954d8dadd8f0275d5bbe4135976e200c0bc017219c4bc6b92"}
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.516591 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.518463 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bd2gn" event={"ID":"97c3d1a4-1ecf-4c47-86e2-068336153e40","Type":"ContainerStarted","Data":"9cec882aa5a1f8e0abc7ae0ff9eb59c325d554327f2270dd8000777bdbdd8629"}
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.518489 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bd2gn" event={"ID":"97c3d1a4-1ecf-4c47-86e2-068336153e40","Type":"ContainerStarted","Data":"f6454927a71e9ce7e97e4fe3621a53db78fa610a0548aba26634f6a7ce0235c3"}
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.528333 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hnqjm" podStartSLOduration=1.651780155 podStartE2EDuration="13.528310298s" podCreationTimestamp="2026-03-12 13:28:02 +0000 UTC" firstStartedPulling="2026-03-12 13:28:02.986383005 +0000 UTC m=+1105.676454976" lastFinishedPulling="2026-03-12 13:28:14.862913148 +0000 UTC m=+1117.552985119" observedRunningTime="2026-03-12 13:28:15.526624377 +0000 UTC m=+1118.216696338" watchObservedRunningTime="2026-03-12 13:28:15.528310298 +0000 UTC m=+1118.218382269"
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.544299 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-bd2gn" podStartSLOduration=6.544279191 podStartE2EDuration="6.544279191s" podCreationTimestamp="2026-03-12 13:28:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:28:15.542936049 +0000 UTC m=+1118.233008040" watchObservedRunningTime="2026-03-12 13:28:15.544279191 +0000 UTC m=+1118.234351162"
Mar 12 13:28:15 crc kubenswrapper[4921]: I0312 13:28:15.567580 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.889423975 podStartE2EDuration="58.567557607s" podCreationTimestamp="2026-03-12 13:27:17 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.612532196 +0000 UTC m=+1073.302604167" lastFinishedPulling="2026-03-12 13:27:37.290665828 +0000 UTC m=+1079.980737799" observedRunningTime="2026-03-12 13:28:15.563150292 +0000 UTC m=+1118.253222263" watchObservedRunningTime="2026-03-12 13:28:15.567557607 +0000 UTC m=+1118.257629578"
Mar 12 13:28:16 crc kubenswrapper[4921]: I0312 13:28:16.532905 4921 generic.go:334] "Generic (PLEG): container finished" podID="97c3d1a4-1ecf-4c47-86e2-068336153e40" containerID="9cec882aa5a1f8e0abc7ae0ff9eb59c325d554327f2270dd8000777bdbdd8629" exitCode=0
Mar 12 13:28:16 crc kubenswrapper[4921]: I0312 13:28:16.533003 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bd2gn" event={"ID":"97c3d1a4-1ecf-4c47-86e2-068336153e40","Type":"ContainerDied","Data":"9cec882aa5a1f8e0abc7ae0ff9eb59c325d554327f2270dd8000777bdbdd8629"}
Mar 12 13:28:16 crc kubenswrapper[4921]: I0312 13:28:16.568121 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.221897199 podStartE2EDuration="59.568094314s" podCreationTimestamp="2026-03-12 13:27:17 +0000 UTC" firstStartedPulling="2026-03-12 13:27:30.49840669 +0000 UTC m=+1073.188478661" lastFinishedPulling="2026-03-12 13:27:36.844603805 +0000 UTC m=+1079.534675776" observedRunningTime="2026-03-12 13:28:15.597681036 +0000 UTC m=+1118.287753027" watchObservedRunningTime="2026-03-12 13:28:16.568094314 +0000 UTC m=+1119.258166295"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.333017 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s4mtb" podUID="6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb" containerName="ovn-controller" probeResult="failure" output=<
Mar 12 13:28:17 crc kubenswrapper[4921]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 12 13:28:17 crc kubenswrapper[4921]: >
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.342557 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-z4nmg"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.535885 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s4mtb-config-jbhgb"]
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.537001 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.538935 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.557382 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s4mtb-config-jbhgb"]
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.617547 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.617590 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run-ovn\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.617637 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-additional-scripts\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.617670 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-log-ovn\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.617741 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8tv7\" (UniqueName: \"kubernetes.io/projected/e002d00e-bebd-4959-85af-a09553a11bb8-kube-api-access-x8tv7\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.617950 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-scripts\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719251 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run-ovn\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719375 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-additional-scripts\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719406 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-log-ovn\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8tv7\" (UniqueName: \"kubernetes.io/projected/e002d00e-bebd-4959-85af-a09553a11bb8-kube-api-access-x8tv7\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719501 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-scripts\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719597 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719616 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run-ovn\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.719643 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-log-ovn\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.720185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-additional-scripts\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.721517 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-scripts\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.742184 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8tv7\" (UniqueName: \"kubernetes.io/projected/e002d00e-bebd-4959-85af-a09553a11bb8-kube-api-access-x8tv7\") pod \"ovn-controller-s4mtb-config-jbhgb\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.865645 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-jbhgb"
Mar 12 13:28:17 crc kubenswrapper[4921]: I0312 13:28:17.873181 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bd2gn"
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.025537 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97c3d1a4-1ecf-4c47-86e2-068336153e40-operator-scripts\") pod \"97c3d1a4-1ecf-4c47-86e2-068336153e40\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") "
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.026029 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7njh4\" (UniqueName: \"kubernetes.io/projected/97c3d1a4-1ecf-4c47-86e2-068336153e40-kube-api-access-7njh4\") pod \"97c3d1a4-1ecf-4c47-86e2-068336153e40\" (UID: \"97c3d1a4-1ecf-4c47-86e2-068336153e40\") "
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.026269 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c3d1a4-1ecf-4c47-86e2-068336153e40-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97c3d1a4-1ecf-4c47-86e2-068336153e40" (UID: "97c3d1a4-1ecf-4c47-86e2-068336153e40"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.026613 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97c3d1a4-1ecf-4c47-86e2-068336153e40-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.029248 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c3d1a4-1ecf-4c47-86e2-068336153e40-kube-api-access-7njh4" (OuterVolumeSpecName: "kube-api-access-7njh4") pod "97c3d1a4-1ecf-4c47-86e2-068336153e40" (UID: "97c3d1a4-1ecf-4c47-86e2-068336153e40"). InnerVolumeSpecName "kube-api-access-7njh4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.128777 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7njh4\" (UniqueName: \"kubernetes.io/projected/97c3d1a4-1ecf-4c47-86e2-068336153e40-kube-api-access-7njh4\") on node \"crc\" DevicePath \"\""
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.337373 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s4mtb-config-jbhgb"]
Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.551717 4921 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-bd2gn" Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.551705 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bd2gn" event={"ID":"97c3d1a4-1ecf-4c47-86e2-068336153e40","Type":"ContainerDied","Data":"f6454927a71e9ce7e97e4fe3621a53db78fa610a0548aba26634f6a7ce0235c3"} Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.552660 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6454927a71e9ce7e97e4fe3621a53db78fa610a0548aba26634f6a7ce0235c3" Mar 12 13:28:18 crc kubenswrapper[4921]: I0312 13:28:18.555787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb-config-jbhgb" event={"ID":"e002d00e-bebd-4959-85af-a09553a11bb8","Type":"ContainerStarted","Data":"4564077dc165e4f75f2f8edd0d9242902b77dd7bfe5b95aa9f942640af11bbe1"} Mar 12 13:28:19 crc kubenswrapper[4921]: I0312 13:28:19.567850 4921 generic.go:334] "Generic (PLEG): container finished" podID="e002d00e-bebd-4959-85af-a09553a11bb8" containerID="3c5c0ffc7161c468760a7964361fbe00731b21aae9c39610fb20c3dc3e9e3a7e" exitCode=0 Mar 12 13:28:19 crc kubenswrapper[4921]: I0312 13:28:19.567913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb-config-jbhgb" event={"ID":"e002d00e-bebd-4959-85af-a09553a11bb8","Type":"ContainerDied","Data":"3c5c0ffc7161c468760a7964361fbe00731b21aae9c39610fb20c3dc3e9e3a7e"} Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.884217 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-jbhgb" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.974785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run-ovn\") pod \"e002d00e-bebd-4959-85af-a09553a11bb8\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.974852 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8tv7\" (UniqueName: \"kubernetes.io/projected/e002d00e-bebd-4959-85af-a09553a11bb8-kube-api-access-x8tv7\") pod \"e002d00e-bebd-4959-85af-a09553a11bb8\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.974880 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e002d00e-bebd-4959-85af-a09553a11bb8" (UID: "e002d00e-bebd-4959-85af-a09553a11bb8"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.974908 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-scripts\") pod \"e002d00e-bebd-4959-85af-a09553a11bb8\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.974942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run\") pod \"e002d00e-bebd-4959-85af-a09553a11bb8\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.974957 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-additional-scripts\") pod \"e002d00e-bebd-4959-85af-a09553a11bb8\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975051 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-log-ovn\") pod \"e002d00e-bebd-4959-85af-a09553a11bb8\" (UID: \"e002d00e-bebd-4959-85af-a09553a11bb8\") " Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975062 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run" (OuterVolumeSpecName: "var-run") pod "e002d00e-bebd-4959-85af-a09553a11bb8" (UID: "e002d00e-bebd-4959-85af-a09553a11bb8"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975234 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e002d00e-bebd-4959-85af-a09553a11bb8" (UID: "e002d00e-bebd-4959-85af-a09553a11bb8"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975604 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e002d00e-bebd-4959-85af-a09553a11bb8" (UID: "e002d00e-bebd-4959-85af-a09553a11bb8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975698 4921 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975724 4921 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975736 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e002d00e-bebd-4959-85af-a09553a11bb8-var-run\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.975869 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-scripts" (OuterVolumeSpecName: "scripts") pod 
"e002d00e-bebd-4959-85af-a09553a11bb8" (UID: "e002d00e-bebd-4959-85af-a09553a11bb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:20 crc kubenswrapper[4921]: I0312 13:28:20.980622 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e002d00e-bebd-4959-85af-a09553a11bb8-kube-api-access-x8tv7" (OuterVolumeSpecName: "kube-api-access-x8tv7") pod "e002d00e-bebd-4959-85af-a09553a11bb8" (UID: "e002d00e-bebd-4959-85af-a09553a11bb8"). InnerVolumeSpecName "kube-api-access-x8tv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.077925 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8tv7\" (UniqueName: \"kubernetes.io/projected/e002d00e-bebd-4959-85af-a09553a11bb8-kube-api-access-x8tv7\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.077992 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.078018 4921 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e002d00e-bebd-4959-85af-a09553a11bb8-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.589120 4921 generic.go:334] "Generic (PLEG): container finished" podID="7c4cb6d3-4372-4daa-bebb-49c822b98228" containerID="f3c22bce66ad9c60d84382052c554ec84ad5fedfddff0fb93f94d516208b2105" exitCode=0 Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.589307 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hnqjm" 
event={"ID":"7c4cb6d3-4372-4daa-bebb-49c822b98228","Type":"ContainerDied","Data":"f3c22bce66ad9c60d84382052c554ec84ad5fedfddff0fb93f94d516208b2105"} Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.591767 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb-config-jbhgb" event={"ID":"e002d00e-bebd-4959-85af-a09553a11bb8","Type":"ContainerDied","Data":"4564077dc165e4f75f2f8edd0d9242902b77dd7bfe5b95aa9f942640af11bbe1"} Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.591831 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4564077dc165e4f75f2f8edd0d9242902b77dd7bfe5b95aa9f942640af11bbe1" Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.592109 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-jbhgb" Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.998899 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s4mtb-config-jbhgb"] Mar 12 13:28:21 crc kubenswrapper[4921]: I0312 13:28:21.998937 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s4mtb-config-jbhgb"] Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.100548 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s4mtb-config-9prrv"] Mar 12 13:28:22 crc kubenswrapper[4921]: E0312 13:28:22.101214 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e002d00e-bebd-4959-85af-a09553a11bb8" containerName="ovn-config" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.101238 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e002d00e-bebd-4959-85af-a09553a11bb8" containerName="ovn-config" Mar 12 13:28:22 crc kubenswrapper[4921]: E0312 13:28:22.101271 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c3d1a4-1ecf-4c47-86e2-068336153e40" containerName="mariadb-account-create-update" Mar 12 
13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.101280 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c3d1a4-1ecf-4c47-86e2-068336153e40" containerName="mariadb-account-create-update" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.101461 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e002d00e-bebd-4959-85af-a09553a11bb8" containerName="ovn-config" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.101510 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c3d1a4-1ecf-4c47-86e2-068336153e40" containerName="mariadb-account-create-update" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.102149 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.103804 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.119316 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s4mtb-config-9prrv"] Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.200728 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmcjc\" (UniqueName: \"kubernetes.io/projected/36d72a20-0d83-4d49-958e-ea163376c405-kube-api-access-rmcjc\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.200787 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-additional-scripts\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " 
pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.200845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-scripts\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.200873 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.200924 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run-ovn\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.200955 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-log-ovn\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.302790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmcjc\" (UniqueName: \"kubernetes.io/projected/36d72a20-0d83-4d49-958e-ea163376c405-kube-api-access-rmcjc\") pod \"ovn-controller-s4mtb-config-9prrv\" 
(UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.302848 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-additional-scripts\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.302884 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-scripts\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.302917 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.302957 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run-ovn\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.302984 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-log-ovn\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: 
\"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.303258 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-log-ovn\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.303259 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.303307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run-ovn\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.303792 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-additional-scripts\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.306627 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-scripts\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " 
pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.341746 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmcjc\" (UniqueName: \"kubernetes.io/projected/36d72a20-0d83-4d49-958e-ea163376c405-kube-api-access-rmcjc\") pod \"ovn-controller-s4mtb-config-9prrv\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.377923 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s4mtb" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.420573 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.689095 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s4mtb-config-9prrv"] Mar 12 13:28:22 crc kubenswrapper[4921]: W0312 13:28:22.698343 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36d72a20_0d83_4d49_958e_ea163376c405.slice/crio-a71f320f2cab13a9d017c0f23cff68ec5f317c0d457db7511d8cbbd060790b23 WatchSource:0}: Error finding container a71f320f2cab13a9d017c0f23cff68ec5f317c0d457db7511d8cbbd060790b23: Status 404 returned error can't find the container with id a71f320f2cab13a9d017c0f23cff68ec5f317c0d457db7511d8cbbd060790b23 Mar 12 13:28:22 crc kubenswrapper[4921]: I0312 13:28:22.945565 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.116999 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-combined-ca-bundle\") pod \"7c4cb6d3-4372-4daa-bebb-49c822b98228\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.117155 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-config-data\") pod \"7c4cb6d3-4372-4daa-bebb-49c822b98228\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.117197 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-275cv\" (UniqueName: \"kubernetes.io/projected/7c4cb6d3-4372-4daa-bebb-49c822b98228-kube-api-access-275cv\") pod \"7c4cb6d3-4372-4daa-bebb-49c822b98228\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.117229 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-db-sync-config-data\") pod \"7c4cb6d3-4372-4daa-bebb-49c822b98228\" (UID: \"7c4cb6d3-4372-4daa-bebb-49c822b98228\") " Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.123680 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c4cb6d3-4372-4daa-bebb-49c822b98228-kube-api-access-275cv" (OuterVolumeSpecName: "kube-api-access-275cv") pod "7c4cb6d3-4372-4daa-bebb-49c822b98228" (UID: "7c4cb6d3-4372-4daa-bebb-49c822b98228"). InnerVolumeSpecName "kube-api-access-275cv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.123681 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7c4cb6d3-4372-4daa-bebb-49c822b98228" (UID: "7c4cb6d3-4372-4daa-bebb-49c822b98228"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.144458 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c4cb6d3-4372-4daa-bebb-49c822b98228" (UID: "7c4cb6d3-4372-4daa-bebb-49c822b98228"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.163902 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-config-data" (OuterVolumeSpecName: "config-data") pod "7c4cb6d3-4372-4daa-bebb-49c822b98228" (UID: "7c4cb6d3-4372-4daa-bebb-49c822b98228"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.219912 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.220078 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.220182 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-275cv\" (UniqueName: \"kubernetes.io/projected/7c4cb6d3-4372-4daa-bebb-49c822b98228-kube-api-access-275cv\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.220260 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7c4cb6d3-4372-4daa-bebb-49c822b98228-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.618195 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hnqjm" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.619378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hnqjm" event={"ID":"7c4cb6d3-4372-4daa-bebb-49c822b98228","Type":"ContainerDied","Data":"36709a55cafead3c857f7d5410abed283e2ee903ca2b1c1aa5a799fb1b001e30"} Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.620319 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36709a55cafead3c857f7d5410abed283e2ee903ca2b1c1aa5a799fb1b001e30" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.623602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb-config-9prrv" event={"ID":"36d72a20-0d83-4d49-958e-ea163376c405","Type":"ContainerDied","Data":"ab86bbafa64b7b54ca717739766a6060392900703d9735765d290b4d35b9b56c"} Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.625964 4921 generic.go:334] "Generic (PLEG): container finished" podID="36d72a20-0d83-4d49-958e-ea163376c405" containerID="ab86bbafa64b7b54ca717739766a6060392900703d9735765d290b4d35b9b56c" exitCode=0 Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.626076 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb-config-9prrv" event={"ID":"36d72a20-0d83-4d49-958e-ea163376c405","Type":"ContainerStarted","Data":"a71f320f2cab13a9d017c0f23cff68ec5f317c0d457db7511d8cbbd060790b23"} Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.995195 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e002d00e-bebd-4959-85af-a09553a11bb8" path="/var/lib/kubelet/pods/e002d00e-bebd-4959-85af-a09553a11bb8/volumes" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.995982 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-srs62"] Mar 12 13:28:23 crc kubenswrapper[4921]: E0312 13:28:23.996273 4921 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="7c4cb6d3-4372-4daa-bebb-49c822b98228" containerName="glance-db-sync" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.996288 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c4cb6d3-4372-4daa-bebb-49c822b98228" containerName="glance-db-sync" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.996458 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c4cb6d3-4372-4daa-bebb-49c822b98228" containerName="glance-db-sync" Mar 12 13:28:23 crc kubenswrapper[4921]: I0312 13:28:23.997432 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.012712 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-srs62"] Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.137758 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwmck\" (UniqueName: \"kubernetes.io/projected/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-kube-api-access-rwmck\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.137852 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.138026 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: 
\"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.138049 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-config\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.138086 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.239432 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwmck\" (UniqueName: \"kubernetes.io/projected/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-kube-api-access-rwmck\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.239501 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.239550 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: 
\"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.239569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-config\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.239609 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.240678 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.240774 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.240795 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-config\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc 
kubenswrapper[4921]: I0312 13:28:24.240914 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.259861 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwmck\" (UniqueName: \"kubernetes.io/projected/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-kube-api-access-rwmck\") pod \"dnsmasq-dns-54f9b7b8d9-srs62\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.315203 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.767398 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-srs62"] Mar 12 13:28:24 crc kubenswrapper[4921]: I0312 13:28:24.904627 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052009 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-additional-scripts\") pod \"36d72a20-0d83-4d49-958e-ea163376c405\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052111 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-log-ovn\") pod \"36d72a20-0d83-4d49-958e-ea163376c405\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052159 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-scripts\") pod \"36d72a20-0d83-4d49-958e-ea163376c405\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052230 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmcjc\" (UniqueName: \"kubernetes.io/projected/36d72a20-0d83-4d49-958e-ea163376c405-kube-api-access-rmcjc\") pod \"36d72a20-0d83-4d49-958e-ea163376c405\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052227 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "36d72a20-0d83-4d49-958e-ea163376c405" (UID: "36d72a20-0d83-4d49-958e-ea163376c405"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052480 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "36d72a20-0d83-4d49-958e-ea163376c405" (UID: "36d72a20-0d83-4d49-958e-ea163376c405"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052643 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "36d72a20-0d83-4d49-958e-ea163376c405" (UID: "36d72a20-0d83-4d49-958e-ea163376c405"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052927 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-scripts" (OuterVolumeSpecName: "scripts") pod "36d72a20-0d83-4d49-958e-ea163376c405" (UID: "36d72a20-0d83-4d49-958e-ea163376c405"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.052957 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run-ovn\") pod \"36d72a20-0d83-4d49-958e-ea163376c405\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053034 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run\") pod \"36d72a20-0d83-4d49-958e-ea163376c405\" (UID: \"36d72a20-0d83-4d49-958e-ea163376c405\") " Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053076 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run" (OuterVolumeSpecName: "var-run") pod "36d72a20-0d83-4d49-958e-ea163376c405" (UID: "36d72a20-0d83-4d49-958e-ea163376c405"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053358 4921 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053381 4921 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053392 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36d72a20-0d83-4d49-958e-ea163376c405-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053406 4921 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.053414 4921 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/36d72a20-0d83-4d49-958e-ea163376c405-var-run\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.055083 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d72a20-0d83-4d49-958e-ea163376c405-kube-api-access-rmcjc" (OuterVolumeSpecName: "kube-api-access-rmcjc") pod "36d72a20-0d83-4d49-958e-ea163376c405" (UID: "36d72a20-0d83-4d49-958e-ea163376c405"). InnerVolumeSpecName "kube-api-access-rmcjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.155278 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmcjc\" (UniqueName: \"kubernetes.io/projected/36d72a20-0d83-4d49-958e-ea163376c405-kube-api-access-rmcjc\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.639550 4921 generic.go:334] "Generic (PLEG): container finished" podID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerID="49f8886d7160b0c64fd0a031464306729473f2b2cfec92335eeaf84a2649b2af" exitCode=0 Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.639922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" event={"ID":"f12d28a3-bd3a-4484-b5a2-98721ada3b7e","Type":"ContainerDied","Data":"49f8886d7160b0c64fd0a031464306729473f2b2cfec92335eeaf84a2649b2af"} Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.639950 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" event={"ID":"f12d28a3-bd3a-4484-b5a2-98721ada3b7e","Type":"ContainerStarted","Data":"7fb166f97251e07d8935a57cda6ad33720b184cd24e8d17e2939c8080b223a13"} Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.642439 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s4mtb-config-9prrv" event={"ID":"36d72a20-0d83-4d49-958e-ea163376c405","Type":"ContainerDied","Data":"a71f320f2cab13a9d017c0f23cff68ec5f317c0d457db7511d8cbbd060790b23"} Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.642475 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a71f320f2cab13a9d017c0f23cff68ec5f317c0d457db7511d8cbbd060790b23" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.642534 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s4mtb-config-9prrv" Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.982059 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s4mtb-config-9prrv"] Mar 12 13:28:25 crc kubenswrapper[4921]: I0312 13:28:25.993209 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s4mtb-config-9prrv"] Mar 12 13:28:26 crc kubenswrapper[4921]: I0312 13:28:26.653388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" event={"ID":"f12d28a3-bd3a-4484-b5a2-98721ada3b7e","Type":"ContainerStarted","Data":"d919f77d2517b14200444846ad80d74de444231258e626c9d3d8594c3eb01f3a"} Mar 12 13:28:26 crc kubenswrapper[4921]: I0312 13:28:26.653919 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:26 crc kubenswrapper[4921]: I0312 13:28:26.671188 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" podStartSLOduration=3.671169126 podStartE2EDuration="3.671169126s" podCreationTimestamp="2026-03-12 13:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:28:26.666770641 +0000 UTC m=+1129.356842652" watchObservedRunningTime="2026-03-12 13:28:26.671169126 +0000 UTC m=+1129.361241098" Mar 12 13:28:27 crc kubenswrapper[4921]: I0312 13:28:27.998466 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d72a20-0d83-4d49-958e-ea163376c405" path="/var/lib/kubelet/pods/36d72a20-0d83-4d49-958e-ea163376c405/volumes" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.372009 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.673409 4921 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-db-create-qs7p6"] Mar 12 13:28:28 crc kubenswrapper[4921]: E0312 13:28:28.674126 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d72a20-0d83-4d49-958e-ea163376c405" containerName="ovn-config" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.674152 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d72a20-0d83-4d49-958e-ea163376c405" containerName="ovn-config" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.674349 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d72a20-0d83-4d49-958e-ea163376c405" containerName="ovn-config" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.675036 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.682115 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qs7p6"] Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.713074 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.799084 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b1f7-account-create-update-7qvml"] Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.800214 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.801802 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.813852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b1f7-account-create-update-7qvml"] Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.814693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e61f83e-1a98-4c70-adfd-537d68cf4d62-operator-scripts\") pod \"cinder-db-create-qs7p6\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.814785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtk9p\" (UniqueName: \"kubernetes.io/projected/5e61f83e-1a98-4c70-adfd-537d68cf4d62-kube-api-access-jtk9p\") pod \"cinder-db-create-qs7p6\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.915754 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e61f83e-1a98-4c70-adfd-537d68cf4d62-operator-scripts\") pod \"cinder-db-create-qs7p6\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.915834 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-operator-scripts\") pod \"cinder-b1f7-account-create-update-7qvml\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " 
pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.915888 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtk9p\" (UniqueName: \"kubernetes.io/projected/5e61f83e-1a98-4c70-adfd-537d68cf4d62-kube-api-access-jtk9p\") pod \"cinder-db-create-qs7p6\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.915961 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkps2\" (UniqueName: \"kubernetes.io/projected/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-kube-api-access-wkps2\") pod \"cinder-b1f7-account-create-update-7qvml\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.916562 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e61f83e-1a98-4c70-adfd-537d68cf4d62-operator-scripts\") pod \"cinder-db-create-qs7p6\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.948486 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtk9p\" (UniqueName: \"kubernetes.io/projected/5e61f83e-1a98-4c70-adfd-537d68cf4d62-kube-api-access-jtk9p\") pod \"cinder-db-create-qs7p6\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.986745 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jwmtn"] Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.990278 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:28 crc kubenswrapper[4921]: I0312 13:28:28.993956 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.003527 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jwmtn"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.012037 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-5b2c-account-create-update-wtbtv"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.013040 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.016573 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.017397 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-operator-scripts\") pod \"cinder-b1f7-account-create-update-7qvml\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.017475 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkps2\" (UniqueName: \"kubernetes.io/projected/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-kube-api-access-wkps2\") pod \"cinder-b1f7-account-create-update-7qvml\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.018362 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-operator-scripts\") pod \"cinder-b1f7-account-create-update-7qvml\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.039885 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5b2c-account-create-update-wtbtv"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.042794 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkps2\" (UniqueName: \"kubernetes.io/projected/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-kube-api-access-wkps2\") pod \"cinder-b1f7-account-create-update-7qvml\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.086959 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vdqdz"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.093514 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.117296 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.122899 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95af087a-ade7-45d2-b6a6-6ba5f6377393-operator-scripts\") pod \"barbican-5b2c-account-create-update-wtbtv\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.123024 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vdqdz"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.123104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79wcj\" (UniqueName: \"kubernetes.io/projected/95af087a-ade7-45d2-b6a6-6ba5f6377393-kube-api-access-79wcj\") pod \"barbican-5b2c-account-create-update-wtbtv\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.123185 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/234f532a-d318-49e4-91b9-731f2caa088d-operator-scripts\") pod \"barbican-db-create-jwmtn\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.123341 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bjz\" (UniqueName: \"kubernetes.io/projected/234f532a-d318-49e4-91b9-731f2caa088d-kube-api-access-g5bjz\") pod \"barbican-db-create-jwmtn\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 
13:28:29.138904 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ceba-account-create-update-wpb2z"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.141286 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.151115 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.160987 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ceba-account-create-update-wpb2z"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.190206 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9sscd"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.191300 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.202419 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.202454 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.202654 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.202907 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4ws54" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.219253 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9sscd"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234686 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/95af087a-ade7-45d2-b6a6-6ba5f6377393-operator-scripts\") pod \"barbican-5b2c-account-create-update-wtbtv\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234783 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpt6h\" (UniqueName: \"kubernetes.io/projected/30f75b77-6080-41e8-a5db-9aa45c1c8fec-kube-api-access-vpt6h\") pod \"neutron-ceba-account-create-update-wpb2z\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234821 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79wcj\" (UniqueName: \"kubernetes.io/projected/95af087a-ade7-45d2-b6a6-6ba5f6377393-kube-api-access-79wcj\") pod \"barbican-5b2c-account-create-update-wtbtv\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234858 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmg29\" (UniqueName: \"kubernetes.io/projected/4063981c-ffb9-4312-887c-8ca83e478a9d-kube-api-access-xmg29\") pod \"neutron-db-create-vdqdz\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234880 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4063981c-ffb9-4312-887c-8ca83e478a9d-operator-scripts\") pod \"neutron-db-create-vdqdz\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234902 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/234f532a-d318-49e4-91b9-731f2caa088d-operator-scripts\") pod \"barbican-db-create-jwmtn\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234925 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30f75b77-6080-41e8-a5db-9aa45c1c8fec-operator-scripts\") pod \"neutron-ceba-account-create-update-wpb2z\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.234941 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bjz\" (UniqueName: \"kubernetes.io/projected/234f532a-d318-49e4-91b9-731f2caa088d-kube-api-access-g5bjz\") pod \"barbican-db-create-jwmtn\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.235805 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95af087a-ade7-45d2-b6a6-6ba5f6377393-operator-scripts\") pod \"barbican-5b2c-account-create-update-wtbtv\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.236421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/234f532a-d318-49e4-91b9-731f2caa088d-operator-scripts\") pod \"barbican-db-create-jwmtn\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.271626 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79wcj\" (UniqueName: \"kubernetes.io/projected/95af087a-ade7-45d2-b6a6-6ba5f6377393-kube-api-access-79wcj\") pod \"barbican-5b2c-account-create-update-wtbtv\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.278966 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bjz\" (UniqueName: \"kubernetes.io/projected/234f532a-d318-49e4-91b9-731f2caa088d-kube-api-access-g5bjz\") pod \"barbican-db-create-jwmtn\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.309340 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.335839 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmg29\" (UniqueName: \"kubernetes.io/projected/4063981c-ffb9-4312-887c-8ca83e478a9d-kube-api-access-xmg29\") pod \"neutron-db-create-vdqdz\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.335898 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4063981c-ffb9-4312-887c-8ca83e478a9d-operator-scripts\") pod \"neutron-db-create-vdqdz\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.335929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30f75b77-6080-41e8-a5db-9aa45c1c8fec-operator-scripts\") pod 
\"neutron-ceba-account-create-update-wpb2z\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.335966 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ztpl\" (UniqueName: \"kubernetes.io/projected/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-kube-api-access-4ztpl\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.335986 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-combined-ca-bundle\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.336009 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-config-data\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.336073 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt6h\" (UniqueName: \"kubernetes.io/projected/30f75b77-6080-41e8-a5db-9aa45c1c8fec-kube-api-access-vpt6h\") pod \"neutron-ceba-account-create-update-wpb2z\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.336714 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4063981c-ffb9-4312-887c-8ca83e478a9d-operator-scripts\") pod \"neutron-db-create-vdqdz\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.336714 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30f75b77-6080-41e8-a5db-9aa45c1c8fec-operator-scripts\") pod \"neutron-ceba-account-create-update-wpb2z\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.360747 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpt6h\" (UniqueName: \"kubernetes.io/projected/30f75b77-6080-41e8-a5db-9aa45c1c8fec-kube-api-access-vpt6h\") pod \"neutron-ceba-account-create-update-wpb2z\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.361706 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmg29\" (UniqueName: \"kubernetes.io/projected/4063981c-ffb9-4312-887c-8ca83e478a9d-kube-api-access-xmg29\") pod \"neutron-db-create-vdqdz\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.417705 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.437622 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ztpl\" (UniqueName: \"kubernetes.io/projected/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-kube-api-access-4ztpl\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.437923 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-combined-ca-bundle\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.438014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-config-data\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.440214 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.443379 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-config-data\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.447580 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-combined-ca-bundle\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.454067 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ztpl\" (UniqueName: \"kubernetes.io/projected/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-kube-api-access-4ztpl\") pod \"keystone-db-sync-9sscd\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.546420 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.563047 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.612206 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qs7p6"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.694937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qs7p6" event={"ID":"5e61f83e-1a98-4c70-adfd-537d68cf4d62","Type":"ContainerStarted","Data":"4dc4fd2c779ed5d279a0f39dca3ffd38c8bc8b2163496a045a325a8c540f5951"} Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.723605 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b1f7-account-create-update-7qvml"] Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.824392 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jwmtn"] Mar 12 13:28:29 crc kubenswrapper[4921]: W0312 13:28:29.842324 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod234f532a_d318_49e4_91b9_731f2caa088d.slice/crio-971fae08ae6108dbd29dc61499ebd8ac73c8fff7c33c9aa1e67c8c3001531187 WatchSource:0}: Error finding container 971fae08ae6108dbd29dc61499ebd8ac73c8fff7c33c9aa1e67c8c3001531187: Status 404 returned error can't find the container with id 971fae08ae6108dbd29dc61499ebd8ac73c8fff7c33c9aa1e67c8c3001531187 Mar 12 13:28:29 crc kubenswrapper[4921]: I0312 13:28:29.923707 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vdqdz"] Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.038307 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-5b2c-account-create-update-wtbtv"] Mar 12 13:28:30 crc kubenswrapper[4921]: W0312 13:28:30.071637 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95af087a_ade7_45d2_b6a6_6ba5f6377393.slice/crio-df8a77455de381f83f001cb0690b9647917a2866ab81aeb174c02eaea43145a6 WatchSource:0}: Error finding container df8a77455de381f83f001cb0690b9647917a2866ab81aeb174c02eaea43145a6: Status 404 returned error can't find the container with id df8a77455de381f83f001cb0690b9647917a2866ab81aeb174c02eaea43145a6 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.123246 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ceba-account-create-update-wpb2z"] Mar 12 13:28:30 crc kubenswrapper[4921]: W0312 13:28:30.131451 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30f75b77_6080_41e8_a5db_9aa45c1c8fec.slice/crio-564631b955618960b246ada6addf49148264bd64b1c5af55fd0d693fae1f26ed WatchSource:0}: Error finding container 564631b955618960b246ada6addf49148264bd64b1c5af55fd0d693fae1f26ed: Status 404 returned error can't find the container with id 564631b955618960b246ada6addf49148264bd64b1c5af55fd0d693fae1f26ed Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.209084 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9sscd"] Mar 12 13:28:30 crc kubenswrapper[4921]: W0312 13:28:30.332431 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5b6fc72_721e_4dc3_9aa7_98707cfd403c.slice/crio-900024e52f41fca27de8008e3ac81b8f9f2f6588efea8c9f25501e7264d28aa3 WatchSource:0}: Error finding container 900024e52f41fca27de8008e3ac81b8f9f2f6588efea8c9f25501e7264d28aa3: Status 404 returned error can't find the container with id 900024e52f41fca27de8008e3ac81b8f9f2f6588efea8c9f25501e7264d28aa3 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.702971 4921 generic.go:334] "Generic (PLEG): container finished" podID="30f75b77-6080-41e8-a5db-9aa45c1c8fec" 
containerID="31c30a1d36d9057df2e5dfe033bc499851831c8a1a89ae99c250e9ea97fdb240" exitCode=0 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.703469 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ceba-account-create-update-wpb2z" event={"ID":"30f75b77-6080-41e8-a5db-9aa45c1c8fec","Type":"ContainerDied","Data":"31c30a1d36d9057df2e5dfe033bc499851831c8a1a89ae99c250e9ea97fdb240"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.704069 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ceba-account-create-update-wpb2z" event={"ID":"30f75b77-6080-41e8-a5db-9aa45c1c8fec","Type":"ContainerStarted","Data":"564631b955618960b246ada6addf49148264bd64b1c5af55fd0d693fae1f26ed"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.705380 4921 generic.go:334] "Generic (PLEG): container finished" podID="95af087a-ade7-45d2-b6a6-6ba5f6377393" containerID="b3bfff4e7e90150268e08900a3c490fdf92482415e78b8f9ef56da1dd9945d4c" exitCode=0 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.705552 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5b2c-account-create-update-wtbtv" event={"ID":"95af087a-ade7-45d2-b6a6-6ba5f6377393","Type":"ContainerDied","Data":"b3bfff4e7e90150268e08900a3c490fdf92482415e78b8f9ef56da1dd9945d4c"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.705739 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5b2c-account-create-update-wtbtv" event={"ID":"95af087a-ade7-45d2-b6a6-6ba5f6377393","Type":"ContainerStarted","Data":"df8a77455de381f83f001cb0690b9647917a2866ab81aeb174c02eaea43145a6"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.706798 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9sscd" event={"ID":"f5b6fc72-721e-4dc3-9aa7-98707cfd403c","Type":"ContainerStarted","Data":"900024e52f41fca27de8008e3ac81b8f9f2f6588efea8c9f25501e7264d28aa3"} Mar 12 13:28:30 crc kubenswrapper[4921]: 
I0312 13:28:30.708074 4921 generic.go:334] "Generic (PLEG): container finished" podID="b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" containerID="6918883df55f669524e9f77d8ffb0901e2288660f9b9e5a59566193fe46eb6f8" exitCode=0 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.708190 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1f7-account-create-update-7qvml" event={"ID":"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9","Type":"ContainerDied","Data":"6918883df55f669524e9f77d8ffb0901e2288660f9b9e5a59566193fe46eb6f8"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.708284 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1f7-account-create-update-7qvml" event={"ID":"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9","Type":"ContainerStarted","Data":"99f3eeb3ef32b71c3d35320a2f618bc2b679659bb2e137bb51ad4ccbdb213a79"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.709506 4921 generic.go:334] "Generic (PLEG): container finished" podID="234f532a-d318-49e4-91b9-731f2caa088d" containerID="7a63e758ad1ee2f086920e4d67ff603bedb55499f9447dbeb3145b227a411b3a" exitCode=0 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.709654 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jwmtn" event={"ID":"234f532a-d318-49e4-91b9-731f2caa088d","Type":"ContainerDied","Data":"7a63e758ad1ee2f086920e4d67ff603bedb55499f9447dbeb3145b227a411b3a"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.709710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jwmtn" event={"ID":"234f532a-d318-49e4-91b9-731f2caa088d","Type":"ContainerStarted","Data":"971fae08ae6108dbd29dc61499ebd8ac73c8fff7c33c9aa1e67c8c3001531187"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.711295 4921 generic.go:334] "Generic (PLEG): container finished" podID="4063981c-ffb9-4312-887c-8ca83e478a9d" containerID="de7f23af07688eabaa9e26675fde1b7e4b563ecbdbec31b2feb3d49976f22ea0" exitCode=0 Mar 
12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.711410 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vdqdz" event={"ID":"4063981c-ffb9-4312-887c-8ca83e478a9d","Type":"ContainerDied","Data":"de7f23af07688eabaa9e26675fde1b7e4b563ecbdbec31b2feb3d49976f22ea0"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.711480 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vdqdz" event={"ID":"4063981c-ffb9-4312-887c-8ca83e478a9d","Type":"ContainerStarted","Data":"d01c2fe045e2d96296a74c90189d23922954588e7d6015a1395cc3dc25ccc089"} Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.713187 4921 generic.go:334] "Generic (PLEG): container finished" podID="5e61f83e-1a98-4c70-adfd-537d68cf4d62" containerID="1f851b7e53f0bc224acd376a19e567dfb54923dfe171a0a2f2b16de487de0f93" exitCode=0 Mar 12 13:28:30 crc kubenswrapper[4921]: I0312 13:28:30.713289 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qs7p6" event={"ID":"5e61f83e-1a98-4c70-adfd-537d68cf4d62","Type":"ContainerDied","Data":"1f851b7e53f0bc224acd376a19e567dfb54923dfe171a0a2f2b16de487de0f93"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.131360 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.182979 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e61f83e-1a98-4c70-adfd-537d68cf4d62-operator-scripts\") pod \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.183070 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtk9p\" (UniqueName: \"kubernetes.io/projected/5e61f83e-1a98-4c70-adfd-537d68cf4d62-kube-api-access-jtk9p\") pod \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\" (UID: \"5e61f83e-1a98-4c70-adfd-537d68cf4d62\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.187054 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e61f83e-1a98-4c70-adfd-537d68cf4d62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e61f83e-1a98-4c70-adfd-537d68cf4d62" (UID: "5e61f83e-1a98-4c70-adfd-537d68cf4d62"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.207025 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e61f83e-1a98-4c70-adfd-537d68cf4d62-kube-api-access-jtk9p" (OuterVolumeSpecName: "kube-api-access-jtk9p") pod "5e61f83e-1a98-4c70-adfd-537d68cf4d62" (UID: "5e61f83e-1a98-4c70-adfd-537d68cf4d62"). InnerVolumeSpecName "kube-api-access-jtk9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.285457 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e61f83e-1a98-4c70-adfd-537d68cf4d62-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.285494 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtk9p\" (UniqueName: \"kubernetes.io/projected/5e61f83e-1a98-4c70-adfd-537d68cf4d62-kube-api-access-jtk9p\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.344355 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.348013 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.353923 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.377301 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.380568 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.386632 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkps2\" (UniqueName: \"kubernetes.io/projected/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-kube-api-access-wkps2\") pod \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.386707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4063981c-ffb9-4312-887c-8ca83e478a9d-operator-scripts\") pod \"4063981c-ffb9-4312-887c-8ca83e478a9d\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.386752 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmg29\" (UniqueName: \"kubernetes.io/projected/4063981c-ffb9-4312-887c-8ca83e478a9d-kube-api-access-xmg29\") pod \"4063981c-ffb9-4312-887c-8ca83e478a9d\" (UID: \"4063981c-ffb9-4312-887c-8ca83e478a9d\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.386827 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bjz\" (UniqueName: \"kubernetes.io/projected/234f532a-d318-49e4-91b9-731f2caa088d-kube-api-access-g5bjz\") pod \"234f532a-d318-49e4-91b9-731f2caa088d\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.386859 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-operator-scripts\") pod \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\" (UID: \"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.387050 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/234f532a-d318-49e4-91b9-731f2caa088d-operator-scripts\") pod \"234f532a-d318-49e4-91b9-731f2caa088d\" (UID: \"234f532a-d318-49e4-91b9-731f2caa088d\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.387439 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4063981c-ffb9-4312-887c-8ca83e478a9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4063981c-ffb9-4312-887c-8ca83e478a9d" (UID: "4063981c-ffb9-4312-887c-8ca83e478a9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.387625 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4063981c-ffb9-4312-887c-8ca83e478a9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.388774 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/234f532a-d318-49e4-91b9-731f2caa088d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "234f532a-d318-49e4-91b9-731f2caa088d" (UID: "234f532a-d318-49e4-91b9-731f2caa088d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.389041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" (UID: "b224a4c1-1b4b-47d5-ac92-98560fbb0ca9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.394656 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/234f532a-d318-49e4-91b9-731f2caa088d-kube-api-access-g5bjz" (OuterVolumeSpecName: "kube-api-access-g5bjz") pod "234f532a-d318-49e4-91b9-731f2caa088d" (UID: "234f532a-d318-49e4-91b9-731f2caa088d"). InnerVolumeSpecName "kube-api-access-g5bjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.398038 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-kube-api-access-wkps2" (OuterVolumeSpecName: "kube-api-access-wkps2") pod "b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" (UID: "b224a4c1-1b4b-47d5-ac92-98560fbb0ca9"). InnerVolumeSpecName "kube-api-access-wkps2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.398101 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4063981c-ffb9-4312-887c-8ca83e478a9d-kube-api-access-xmg29" (OuterVolumeSpecName: "kube-api-access-xmg29") pod "4063981c-ffb9-4312-887c-8ca83e478a9d" (UID: "4063981c-ffb9-4312-887c-8ca83e478a9d"). InnerVolumeSpecName "kube-api-access-xmg29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.489055 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30f75b77-6080-41e8-a5db-9aa45c1c8fec-operator-scripts\") pod \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.489650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpt6h\" (UniqueName: \"kubernetes.io/projected/30f75b77-6080-41e8-a5db-9aa45c1c8fec-kube-api-access-vpt6h\") pod \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\" (UID: \"30f75b77-6080-41e8-a5db-9aa45c1c8fec\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.489938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79wcj\" (UniqueName: \"kubernetes.io/projected/95af087a-ade7-45d2-b6a6-6ba5f6377393-kube-api-access-79wcj\") pod \"95af087a-ade7-45d2-b6a6-6ba5f6377393\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.490216 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95af087a-ade7-45d2-b6a6-6ba5f6377393-operator-scripts\") pod \"95af087a-ade7-45d2-b6a6-6ba5f6377393\" (UID: \"95af087a-ade7-45d2-b6a6-6ba5f6377393\") " Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.490776 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30f75b77-6080-41e8-a5db-9aa45c1c8fec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30f75b77-6080-41e8-a5db-9aa45c1c8fec" (UID: "30f75b77-6080-41e8-a5db-9aa45c1c8fec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491333 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkps2\" (UniqueName: \"kubernetes.io/projected/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-kube-api-access-wkps2\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491376 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmg29\" (UniqueName: \"kubernetes.io/projected/4063981c-ffb9-4312-887c-8ca83e478a9d-kube-api-access-xmg29\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491396 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5bjz\" (UniqueName: \"kubernetes.io/projected/234f532a-d318-49e4-91b9-731f2caa088d-kube-api-access-g5bjz\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491426 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491443 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/234f532a-d318-49e4-91b9-731f2caa088d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491460 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30f75b77-6080-41e8-a5db-9aa45c1c8fec-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.491390 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95af087a-ade7-45d2-b6a6-6ba5f6377393-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"95af087a-ade7-45d2-b6a6-6ba5f6377393" (UID: "95af087a-ade7-45d2-b6a6-6ba5f6377393"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.493844 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95af087a-ade7-45d2-b6a6-6ba5f6377393-kube-api-access-79wcj" (OuterVolumeSpecName: "kube-api-access-79wcj") pod "95af087a-ade7-45d2-b6a6-6ba5f6377393" (UID: "95af087a-ade7-45d2-b6a6-6ba5f6377393"). InnerVolumeSpecName "kube-api-access-79wcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.494358 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30f75b77-6080-41e8-a5db-9aa45c1c8fec-kube-api-access-vpt6h" (OuterVolumeSpecName: "kube-api-access-vpt6h") pod "30f75b77-6080-41e8-a5db-9aa45c1c8fec" (UID: "30f75b77-6080-41e8-a5db-9aa45c1c8fec"). InnerVolumeSpecName "kube-api-access-vpt6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.594835 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95af087a-ade7-45d2-b6a6-6ba5f6377393-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.594879 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpt6h\" (UniqueName: \"kubernetes.io/projected/30f75b77-6080-41e8-a5db-9aa45c1c8fec-kube-api-access-vpt6h\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.594892 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79wcj\" (UniqueName: \"kubernetes.io/projected/95af087a-ade7-45d2-b6a6-6ba5f6377393-kube-api-access-79wcj\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.732335 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vdqdz" event={"ID":"4063981c-ffb9-4312-887c-8ca83e478a9d","Type":"ContainerDied","Data":"d01c2fe045e2d96296a74c90189d23922954588e7d6015a1395cc3dc25ccc089"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.732372 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01c2fe045e2d96296a74c90189d23922954588e7d6015a1395cc3dc25ccc089" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.732424 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vdqdz" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.734582 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qs7p6" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.734711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qs7p6" event={"ID":"5e61f83e-1a98-4c70-adfd-537d68cf4d62","Type":"ContainerDied","Data":"4dc4fd2c779ed5d279a0f39dca3ffd38c8bc8b2163496a045a325a8c540f5951"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.734928 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dc4fd2c779ed5d279a0f39dca3ffd38c8bc8b2163496a045a325a8c540f5951" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.736387 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ceba-account-create-update-wpb2z" event={"ID":"30f75b77-6080-41e8-a5db-9aa45c1c8fec","Type":"ContainerDied","Data":"564631b955618960b246ada6addf49148264bd64b1c5af55fd0d693fae1f26ed"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.736430 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="564631b955618960b246ada6addf49148264bd64b1c5af55fd0d693fae1f26ed" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.736501 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ceba-account-create-update-wpb2z" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.738616 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-5b2c-account-create-update-wtbtv" event={"ID":"95af087a-ade7-45d2-b6a6-6ba5f6377393","Type":"ContainerDied","Data":"df8a77455de381f83f001cb0690b9647917a2866ab81aeb174c02eaea43145a6"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.738642 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8a77455de381f83f001cb0690b9647917a2866ab81aeb174c02eaea43145a6" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.738672 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-5b2c-account-create-update-wtbtv" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.741223 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b1f7-account-create-update-7qvml" event={"ID":"b224a4c1-1b4b-47d5-ac92-98560fbb0ca9","Type":"ContainerDied","Data":"99f3eeb3ef32b71c3d35320a2f618bc2b679659bb2e137bb51ad4ccbdb213a79"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.741244 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b1f7-account-create-update-7qvml" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.741251 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99f3eeb3ef32b71c3d35320a2f618bc2b679659bb2e137bb51ad4ccbdb213a79" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.742765 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jwmtn" event={"ID":"234f532a-d318-49e4-91b9-731f2caa088d","Type":"ContainerDied","Data":"971fae08ae6108dbd29dc61499ebd8ac73c8fff7c33c9aa1e67c8c3001531187"} Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.742801 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="971fae08ae6108dbd29dc61499ebd8ac73c8fff7c33c9aa1e67c8c3001531187" Mar 12 13:28:32 crc kubenswrapper[4921]: I0312 13:28:32.742868 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jwmtn" Mar 12 13:28:34 crc kubenswrapper[4921]: I0312 13:28:34.316934 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:28:34 crc kubenswrapper[4921]: I0312 13:28:34.377961 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rgqch"] Mar 12 13:28:34 crc kubenswrapper[4921]: I0312 13:28:34.378238 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerName="dnsmasq-dns" containerID="cri-o://8aeae923b81d66363f05f5405c44d0a1f69eeceb60cd2780d03e1d897bb894c8" gracePeriod=10 Mar 12 13:28:34 crc kubenswrapper[4921]: I0312 13:28:34.759864 4921 generic.go:334] "Generic (PLEG): container finished" podID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerID="8aeae923b81d66363f05f5405c44d0a1f69eeceb60cd2780d03e1d897bb894c8" exitCode=0 Mar 12 13:28:34 crc kubenswrapper[4921]: I0312 13:28:34.759906 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" event={"ID":"464806fa-ec1f-477a-bd5e-bae85b7eaff3","Type":"ContainerDied","Data":"8aeae923b81d66363f05f5405c44d0a1f69eeceb60cd2780d03e1d897bb894c8"} Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.236259 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.366057 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-nb\") pod \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.366132 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-config\") pod \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.366150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-dns-svc\") pod \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.366231 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsvml\" (UniqueName: \"kubernetes.io/projected/464806fa-ec1f-477a-bd5e-bae85b7eaff3-kube-api-access-tsvml\") pod \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.366921 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-sb\") pod \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\" (UID: \"464806fa-ec1f-477a-bd5e-bae85b7eaff3\") " Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.371651 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/464806fa-ec1f-477a-bd5e-bae85b7eaff3-kube-api-access-tsvml" (OuterVolumeSpecName: "kube-api-access-tsvml") pod "464806fa-ec1f-477a-bd5e-bae85b7eaff3" (UID: "464806fa-ec1f-477a-bd5e-bae85b7eaff3"). InnerVolumeSpecName "kube-api-access-tsvml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.402766 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "464806fa-ec1f-477a-bd5e-bae85b7eaff3" (UID: "464806fa-ec1f-477a-bd5e-bae85b7eaff3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.403156 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "464806fa-ec1f-477a-bd5e-bae85b7eaff3" (UID: "464806fa-ec1f-477a-bd5e-bae85b7eaff3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.403658 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "464806fa-ec1f-477a-bd5e-bae85b7eaff3" (UID: "464806fa-ec1f-477a-bd5e-bae85b7eaff3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.419332 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-config" (OuterVolumeSpecName: "config") pod "464806fa-ec1f-477a-bd5e-bae85b7eaff3" (UID: "464806fa-ec1f-477a-bd5e-bae85b7eaff3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.468701 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.468734 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.468745 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.468755 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsvml\" (UniqueName: \"kubernetes.io/projected/464806fa-ec1f-477a-bd5e-bae85b7eaff3-kube-api-access-tsvml\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.468764 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/464806fa-ec1f-477a-bd5e-bae85b7eaff3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.783930 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.783987 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-rgqch" event={"ID":"464806fa-ec1f-477a-bd5e-bae85b7eaff3","Type":"ContainerDied","Data":"b48f1a704572fe236faccd4a47f2b2c53fe3eb6154b3c751d4fd96638b6212f9"} Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.784059 4921 scope.go:117] "RemoveContainer" containerID="8aeae923b81d66363f05f5405c44d0a1f69eeceb60cd2780d03e1d897bb894c8" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.788673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9sscd" event={"ID":"f5b6fc72-721e-4dc3-9aa7-98707cfd403c","Type":"ContainerStarted","Data":"87b5541dd0e5017eb29e642444838157d37ef27ff4c6589461c2ca3515b083ca"} Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.818338 4921 scope.go:117] "RemoveContainer" containerID="8d7f31fcf1ba37287e343ddead93fac5db87fe0a5684cc4b322feb78276403a7" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.821380 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9sscd" podStartSLOduration=2.1390438720000002 podStartE2EDuration="7.821349442s" podCreationTimestamp="2026-03-12 13:28:29 +0000 UTC" firstStartedPulling="2026-03-12 13:28:30.333906495 +0000 UTC m=+1133.023978466" lastFinishedPulling="2026-03-12 13:28:36.016212065 +0000 UTC m=+1138.706284036" observedRunningTime="2026-03-12 13:28:36.816094719 +0000 UTC m=+1139.506166730" watchObservedRunningTime="2026-03-12 13:28:36.821349442 +0000 UTC m=+1139.511421453" Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.856476 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-rgqch"] Mar 12 13:28:36 crc kubenswrapper[4921]: I0312 13:28:36.864442 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-rgqch"] Mar 12 13:28:37 crc kubenswrapper[4921]: I0312 13:28:37.995891 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" path="/var/lib/kubelet/pods/464806fa-ec1f-477a-bd5e-bae85b7eaff3/volumes" Mar 12 13:28:39 crc kubenswrapper[4921]: I0312 13:28:39.226141 4921 scope.go:117] "RemoveContainer" containerID="02b9faa37d3c073ef9dfec8e1be069df54bf22e3bb65f54526c0702581df5f4b" Mar 12 13:28:40 crc kubenswrapper[4921]: I0312 13:28:40.835133 4921 generic.go:334] "Generic (PLEG): container finished" podID="f5b6fc72-721e-4dc3-9aa7-98707cfd403c" containerID="87b5541dd0e5017eb29e642444838157d37ef27ff4c6589461c2ca3515b083ca" exitCode=0 Mar 12 13:28:40 crc kubenswrapper[4921]: I0312 13:28:40.835602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9sscd" event={"ID":"f5b6fc72-721e-4dc3-9aa7-98707cfd403c","Type":"ContainerDied","Data":"87b5541dd0e5017eb29e642444838157d37ef27ff4c6589461c2ca3515b083ca"} Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.194014 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.274600 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ztpl\" (UniqueName: \"kubernetes.io/projected/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-kube-api-access-4ztpl\") pod \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.274706 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-config-data\") pod \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.274790 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-combined-ca-bundle\") pod \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\" (UID: \"f5b6fc72-721e-4dc3-9aa7-98707cfd403c\") " Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.280351 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-kube-api-access-4ztpl" (OuterVolumeSpecName: "kube-api-access-4ztpl") pod "f5b6fc72-721e-4dc3-9aa7-98707cfd403c" (UID: "f5b6fc72-721e-4dc3-9aa7-98707cfd403c"). InnerVolumeSpecName "kube-api-access-4ztpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.320198 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5b6fc72-721e-4dc3-9aa7-98707cfd403c" (UID: "f5b6fc72-721e-4dc3-9aa7-98707cfd403c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.320500 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-config-data" (OuterVolumeSpecName: "config-data") pod "f5b6fc72-721e-4dc3-9aa7-98707cfd403c" (UID: "f5b6fc72-721e-4dc3-9aa7-98707cfd403c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.377480 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ztpl\" (UniqueName: \"kubernetes.io/projected/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-kube-api-access-4ztpl\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.377534 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.377554 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b6fc72-721e-4dc3-9aa7-98707cfd403c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.856277 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9sscd" event={"ID":"f5b6fc72-721e-4dc3-9aa7-98707cfd403c","Type":"ContainerDied","Data":"900024e52f41fca27de8008e3ac81b8f9f2f6588efea8c9f25501e7264d28aa3"} Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.856321 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="900024e52f41fca27de8008e3ac81b8f9f2f6588efea8c9f25501e7264d28aa3" Mar 12 13:28:42 crc kubenswrapper[4921]: I0312 13:28:42.856339 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-9sscd" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.113188 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-mdvhh"] Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116591 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="234f532a-d318-49e4-91b9-731f2caa088d" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116626 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="234f532a-d318-49e4-91b9-731f2caa088d" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116712 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116729 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116780 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4063981c-ffb9-4312-887c-8ca83e478a9d" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116794 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4063981c-ffb9-4312-887c-8ca83e478a9d" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116853 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e61f83e-1a98-4c70-adfd-537d68cf4d62" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116862 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e61f83e-1a98-4c70-adfd-537d68cf4d62" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116881 4921 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="30f75b77-6080-41e8-a5db-9aa45c1c8fec" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116890 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="30f75b77-6080-41e8-a5db-9aa45c1c8fec" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116926 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerName="init" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116942 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerName="init" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116953 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95af087a-ade7-45d2-b6a6-6ba5f6377393" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.116961 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="95af087a-ade7-45d2-b6a6-6ba5f6377393" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.116973 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerName="dnsmasq-dns" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117003 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerName="dnsmasq-dns" Mar 12 13:28:43 crc kubenswrapper[4921]: E0312 13:28:43.117016 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b6fc72-721e-4dc3-9aa7-98707cfd403c" containerName="keystone-db-sync" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117026 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b6fc72-721e-4dc3-9aa7-98707cfd403c" containerName="keystone-db-sync" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117275 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="464806fa-ec1f-477a-bd5e-bae85b7eaff3" containerName="dnsmasq-dns" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117295 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="95af087a-ade7-45d2-b6a6-6ba5f6377393" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117331 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e61f83e-1a98-4c70-adfd-537d68cf4d62" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117346 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b6fc72-721e-4dc3-9aa7-98707cfd403c" containerName="keystone-db-sync" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117359 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4063981c-ffb9-4312-887c-8ca83e478a9d" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117372 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="234f532a-d318-49e4-91b9-731f2caa088d" containerName="mariadb-database-create" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117406 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="30f75b77-6080-41e8-a5db-9aa45c1c8fec" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.117420 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" containerName="mariadb-account-create-update" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.118998 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.126695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-mdvhh"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.182585 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pkq8z"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.183712 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.189697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-config\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.189764 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dtxg\" (UniqueName: \"kubernetes.io/projected/1548b5eb-2638-450c-ad0b-cf217b718b1f-kube-api-access-2dtxg\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.189831 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.189866 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.189891 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-dns-svc\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.191259 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.207710 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.207906 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.208023 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4ws54" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.208537 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.221097 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pkq8z"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.290855 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dtxg\" (UniqueName: \"kubernetes.io/projected/1548b5eb-2638-450c-ad0b-cf217b718b1f-kube-api-access-2dtxg\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 
crc kubenswrapper[4921]: I0312 13:28:43.290904 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-config-data\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.290936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-fernet-keys\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.290962 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-combined-ca-bundle\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.290993 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.291031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 
13:28:43.291053 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-credential-keys\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.291072 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-dns-svc\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.291095 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlv58\" (UniqueName: \"kubernetes.io/projected/e64eca53-8c21-401f-96bb-a74e61cb1ea5-kube-api-access-nlv58\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.291124 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-scripts\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.291148 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-config\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.292144 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-config\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.292591 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.292753 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-dns-svc\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.293289 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.317722 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dtxg\" (UniqueName: \"kubernetes.io/projected/1548b5eb-2638-450c-ad0b-cf217b718b1f-kube-api-access-2dtxg\") pod \"dnsmasq-dns-6546db6db7-mdvhh\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.332520 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-b8t7z"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 
13:28:43.333546 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.339407 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kqsqg" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.339578 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.346039 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.353556 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b8t7z"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.375780 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.377563 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.383504 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.383652 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.397854 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlv58\" (UniqueName: \"kubernetes.io/projected/e64eca53-8c21-401f-96bb-a74e61cb1ea5-kube-api-access-nlv58\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.397907 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-scripts\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.397949 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm987\" (UniqueName: \"kubernetes.io/projected/a9d3161d-0fd9-4116-8e46-74d541735563-kube-api-access-fm987\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.397975 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-scripts\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398006 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-config-data\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-fernet-keys\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398054 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-combined-ca-bundle\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398081 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-combined-ca-bundle\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398109 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-db-sync-config-data\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398135 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-config-data\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398155 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9d3161d-0fd9-4116-8e46-74d541735563-etc-machine-id\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.398180 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-credential-keys\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.402509 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-scripts\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.402975 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-credential-keys\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.413657 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.416560 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-config-data\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.421565 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-fernet-keys\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.423417 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-combined-ca-bundle\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.433417 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlv58\" (UniqueName: \"kubernetes.io/projected/e64eca53-8c21-401f-96bb-a74e61cb1ea5-kube-api-access-nlv58\") pod \"keystone-bootstrap-pkq8z\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.446018 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.482798 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fv8c4"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.483759 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.487446 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.487710 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5n4pc" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501781 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-combined-ca-bundle\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501838 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-run-httpd\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501862 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrmt\" (UniqueName: \"kubernetes.io/projected/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-kube-api-access-7vrmt\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501880 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-db-sync-config-data\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501896 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-config-data\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501919 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-config-data\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9d3161d-0fd9-4116-8e46-74d541735563-etc-machine-id\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.501978 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.502014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm987\" (UniqueName: \"kubernetes.io/projected/a9d3161d-0fd9-4116-8e46-74d541735563-kube-api-access-fm987\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.502035 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-scripts\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.502072 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.502090 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-scripts\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.502105 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-log-httpd\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.502369 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9d3161d-0fd9-4116-8e46-74d541735563-etc-machine-id\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.520458 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-4d9jl"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.521492 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.523244 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.525144 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qmvk5" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.527570 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.528534 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-db-sync-config-data\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.531047 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-combined-ca-bundle\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.532274 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-scripts\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.533695 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-config-data\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " 
pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.541584 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4d9jl"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.542963 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.563237 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm987\" (UniqueName: \"kubernetes.io/projected/a9d3161d-0fd9-4116-8e46-74d541735563-kube-api-access-fm987\") pod \"cinder-db-sync-b8t7z\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.604268 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.604599 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbkt8\" (UniqueName: \"kubernetes.io/projected/cb9315da-7a44-4703-bc68-935d517a4412-kube-api-access-zbkt8\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.604790 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-db-sync-config-data\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.604956 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-config\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.605124 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.605259 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-scripts\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.606398 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-log-httpd\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.606574 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-run-httpd\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.606697 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-combined-ca-bundle\") pod 
\"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.606836 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfq7n\" (UniqueName: \"kubernetes.io/projected/5331d35e-1086-4e7f-aa2f-164117b3df44-kube-api-access-sfq7n\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.606968 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrmt\" (UniqueName: \"kubernetes.io/projected/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-kube-api-access-7vrmt\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.607115 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-config-data\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.608054 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-combined-ca-bundle\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.608360 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-log-httpd\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 
crc kubenswrapper[4921]: I0312 13:28:43.608629 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-run-httpd\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.628504 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fv8c4"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.633297 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.637431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-scripts\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.637928 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.671686 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-config-data\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.674389 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.677094 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vrmt\" (UniqueName: \"kubernetes.io/projected/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-kube-api-access-7vrmt\") pod \"ceilometer-0\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.684900 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-mdvhh"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.710852 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-combined-ca-bundle\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.710933 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbkt8\" (UniqueName: \"kubernetes.io/projected/cb9315da-7a44-4703-bc68-935d517a4412-kube-api-access-zbkt8\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.710962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-db-sync-config-data\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.710980 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-config\") pod 
\"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.711032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-combined-ca-bundle\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.711051 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfq7n\" (UniqueName: \"kubernetes.io/projected/5331d35e-1086-4e7f-aa2f-164117b3df44-kube-api-access-sfq7n\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.728506 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-7jtcj"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.729565 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-db-sync-config-data\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.729599 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.730119 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-config\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.730691 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-combined-ca-bundle\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.732782 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sfv85" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.733060 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.733125 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.739713 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7jtcj"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.744424 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-combined-ca-bundle\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.747801 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfq7n\" 
(UniqueName: \"kubernetes.io/projected/5331d35e-1086-4e7f-aa2f-164117b3df44-kube-api-access-sfq7n\") pod \"neutron-db-sync-fv8c4\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.750782 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-hjjtw"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.752099 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.752906 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbkt8\" (UniqueName: \"kubernetes.io/projected/cb9315da-7a44-4703-bc68-935d517a4412-kube-api-access-zbkt8\") pod \"barbican-db-sync-4d9jl\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") " pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.758035 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-hjjtw"] Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.813201 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.814993 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815106 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815300 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-config-data\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815338 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-logs\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815395 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgks4\" (UniqueName: \"kubernetes.io/projected/98cd11fa-112d-45fe-b67e-ae910048dfbf-kube-api-access-tgks4\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815443 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-scripts\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815531 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-config\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815732 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.815784 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxn69\" (UniqueName: \"kubernetes.io/projected/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-kube-api-access-bxn69\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.866371 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.919792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.919937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-config-data\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.919967 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-logs\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920017 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgks4\" (UniqueName: \"kubernetes.io/projected/98cd11fa-112d-45fe-b67e-ae910048dfbf-kube-api-access-tgks4\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920040 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-scripts\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920080 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-config\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920141 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920159 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxn69\" (UniqueName: \"kubernetes.io/projected/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-kube-api-access-bxn69\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.920231 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.921628 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.922497 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-logs\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.923309 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.923507 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-config\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.925905 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.928182 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.937330 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-scripts\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.938796 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4d9jl" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.940342 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-config-data\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.942452 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle\") pod \"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.944327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxn69\" (UniqueName: \"kubernetes.io/projected/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-kube-api-access-bxn69\") pod 
\"placement-db-sync-7jtcj\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:43 crc kubenswrapper[4921]: I0312 13:28:43.948636 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgks4\" (UniqueName: \"kubernetes.io/projected/98cd11fa-112d-45fe-b67e-ae910048dfbf-kube-api-access-tgks4\") pod \"dnsmasq-dns-7987f74bbc-hjjtw\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:44 crc kubenswrapper[4921]: I0312 13:28:44.072316 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7jtcj" Mar 12 13:28:44 crc kubenswrapper[4921]: I0312 13:28:44.085757 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:44 crc kubenswrapper[4921]: I0312 13:28:44.211276 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-mdvhh"] Mar 12 13:28:44 crc kubenswrapper[4921]: W0312 13:28:44.224635 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1548b5eb_2638_450c_ad0b_cf217b718b1f.slice/crio-eb079b9a2966d1d42370591553ae9804da33e0e778231b5a477df02f0748e1d7 WatchSource:0}: Error finding container eb079b9a2966d1d42370591553ae9804da33e0e778231b5a477df02f0748e1d7: Status 404 returned error can't find the container with id eb079b9a2966d1d42370591553ae9804da33e0e778231b5a477df02f0748e1d7 Mar 12 13:28:44 crc kubenswrapper[4921]: I0312 13:28:44.900144 4921 generic.go:334] "Generic (PLEG): container finished" podID="1548b5eb-2638-450c-ad0b-cf217b718b1f" containerID="d0f31ee7ed83b220b678667d5a809b8a9f8b0c576a45c73e0a076339b9c528fb" exitCode=0 Mar 12 13:28:44 crc kubenswrapper[4921]: I0312 13:28:44.900238 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" 
event={"ID":"1548b5eb-2638-450c-ad0b-cf217b718b1f","Type":"ContainerDied","Data":"d0f31ee7ed83b220b678667d5a809b8a9f8b0c576a45c73e0a076339b9c528fb"} Mar 12 13:28:44 crc kubenswrapper[4921]: I0312 13:28:44.900575 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" event={"ID":"1548b5eb-2638-450c-ad0b-cf217b718b1f","Type":"ContainerStarted","Data":"eb079b9a2966d1d42370591553ae9804da33e0e778231b5a477df02f0748e1d7"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.091459 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pkq8z"] Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.318035 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.467902 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fv8c4"] Mar 12 13:28:45 crc kubenswrapper[4921]: W0312 13:28:45.482879 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9d3161d_0fd9_4116_8e46_74d541735563.slice/crio-50a3a92afd3945fc836f67deea5a57529e0917e1e89c5d340cf9890b87c50df1 WatchSource:0}: Error finding container 50a3a92afd3945fc836f67deea5a57529e0917e1e89c5d340cf9890b87c50df1: Status 404 returned error can't find the container with id 50a3a92afd3945fc836f67deea5a57529e0917e1e89c5d340cf9890b87c50df1 Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.485756 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-b8t7z"] Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.499187 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:28:45 crc kubenswrapper[4921]: W0312 13:28:45.499830 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4af3faf6_1c64_4aa8_81ef_4093bf9ed247.slice/crio-78d20937edc9058d4e201a4c65f2409f80321a75b8da59ff0e74eab259893f3b WatchSource:0}: Error finding container 78d20937edc9058d4e201a4c65f2409f80321a75b8da59ff0e74eab259893f3b: Status 404 returned error can't find the container with id 78d20937edc9058d4e201a4c65f2409f80321a75b8da59ff0e74eab259893f3b Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.619297 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.657999 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-hjjtw"] Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.669783 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-7jtcj"] Mar 12 13:28:45 crc kubenswrapper[4921]: W0312 13:28:45.671265 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98cd11fa_112d_45fe_b67e_ae910048dfbf.slice/crio-f0de17c78f6fa375cb5eedbdf4927c021c412a8d544012d847308dbcb59e2838 WatchSource:0}: Error finding container f0de17c78f6fa375cb5eedbdf4927c021c412a8d544012d847308dbcb59e2838: Status 404 returned error can't find the container with id f0de17c78f6fa375cb5eedbdf4927c021c412a8d544012d847308dbcb59e2838 Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.685197 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-4d9jl"] Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.701166 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-sb\") pod \"1548b5eb-2638-450c-ad0b-cf217b718b1f\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " Mar 12 13:28:45 crc 
kubenswrapper[4921]: I0312 13:28:45.701281 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dtxg\" (UniqueName: \"kubernetes.io/projected/1548b5eb-2638-450c-ad0b-cf217b718b1f-kube-api-access-2dtxg\") pod \"1548b5eb-2638-450c-ad0b-cf217b718b1f\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.701408 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-nb\") pod \"1548b5eb-2638-450c-ad0b-cf217b718b1f\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.701491 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-dns-svc\") pod \"1548b5eb-2638-450c-ad0b-cf217b718b1f\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.701667 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-config\") pod \"1548b5eb-2638-450c-ad0b-cf217b718b1f\" (UID: \"1548b5eb-2638-450c-ad0b-cf217b718b1f\") " Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.705278 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1548b5eb-2638-450c-ad0b-cf217b718b1f-kube-api-access-2dtxg" (OuterVolumeSpecName: "kube-api-access-2dtxg") pod "1548b5eb-2638-450c-ad0b-cf217b718b1f" (UID: "1548b5eb-2638-450c-ad0b-cf217b718b1f"). InnerVolumeSpecName "kube-api-access-2dtxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.722045 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1548b5eb-2638-450c-ad0b-cf217b718b1f" (UID: "1548b5eb-2638-450c-ad0b-cf217b718b1f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.723263 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1548b5eb-2638-450c-ad0b-cf217b718b1f" (UID: "1548b5eb-2638-450c-ad0b-cf217b718b1f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.723400 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1548b5eb-2638-450c-ad0b-cf217b718b1f" (UID: "1548b5eb-2638-450c-ad0b-cf217b718b1f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.727315 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-config" (OuterVolumeSpecName: "config") pod "1548b5eb-2638-450c-ad0b-cf217b718b1f" (UID: "1548b5eb-2638-450c-ad0b-cf217b718b1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.804246 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.804287 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.804304 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.804316 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1548b5eb-2638-450c-ad0b-cf217b718b1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.804329 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dtxg\" (UniqueName: \"kubernetes.io/projected/1548b5eb-2638-450c-ad0b-cf217b718b1f-kube-api-access-2dtxg\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.917202 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pkq8z" event={"ID":"e64eca53-8c21-401f-96bb-a74e61cb1ea5","Type":"ContainerStarted","Data":"8ae6415b86e637b09ed6ed8eb721da79db2b5d0cf0ae025f5ebd65187ff5ba71"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.917256 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pkq8z" 
event={"ID":"e64eca53-8c21-401f-96bb-a74e61cb1ea5","Type":"ContainerStarted","Data":"bc1cc549d66afb440a6f75461f1e6edd9c7b09bda42b7816d932d6e19c724927"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.920131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jtcj" event={"ID":"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015","Type":"ContainerStarted","Data":"3b2ab4fd3aa58a433f66a3db63cfffd236e8b57fa88bf19af965bfc51b618c3d"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.923792 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.923792 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-mdvhh" event={"ID":"1548b5eb-2638-450c-ad0b-cf217b718b1f","Type":"ContainerDied","Data":"eb079b9a2966d1d42370591553ae9804da33e0e778231b5a477df02f0748e1d7"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.923875 4921 scope.go:117] "RemoveContainer" containerID="d0f31ee7ed83b220b678667d5a809b8a9f8b0c576a45c73e0a076339b9c528fb" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.926509 4921 generic.go:334] "Generic (PLEG): container finished" podID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerID="d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20" exitCode=0 Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.926588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" event={"ID":"98cd11fa-112d-45fe-b67e-ae910048dfbf","Type":"ContainerDied","Data":"d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.926629 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" 
event={"ID":"98cd11fa-112d-45fe-b67e-ae910048dfbf","Type":"ContainerStarted","Data":"f0de17c78f6fa375cb5eedbdf4927c021c412a8d544012d847308dbcb59e2838"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.929296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4d9jl" event={"ID":"cb9315da-7a44-4703-bc68-935d517a4412","Type":"ContainerStarted","Data":"c1497eae4ebbacd53faf566f3d5608cd2094faf64633b6ea555c1eefd2bce89b"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.931008 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerStarted","Data":"78d20937edc9058d4e201a4c65f2409f80321a75b8da59ff0e74eab259893f3b"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.939170 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv8c4" event={"ID":"5331d35e-1086-4e7f-aa2f-164117b3df44","Type":"ContainerStarted","Data":"1df323f90efffdd1e6938c4300a799d121d63fc7aead625f796dded8e776229e"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.939234 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv8c4" event={"ID":"5331d35e-1086-4e7f-aa2f-164117b3df44","Type":"ContainerStarted","Data":"673bd319bc175d697d1d696ab05c7951848ebcd55389d1d5e3a9ff7e272586a1"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.943098 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8t7z" event={"ID":"a9d3161d-0fd9-4116-8e46-74d541735563","Type":"ContainerStarted","Data":"50a3a92afd3945fc836f67deea5a57529e0917e1e89c5d340cf9890b87c50df1"} Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.946293 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pkq8z" podStartSLOduration=2.946268007 podStartE2EDuration="2.946268007s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:28:45.937336052 +0000 UTC m=+1148.627408023" watchObservedRunningTime="2026-03-12 13:28:45.946268007 +0000 UTC m=+1148.636339988" Mar 12 13:28:45 crc kubenswrapper[4921]: I0312 13:28:45.996418 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fv8c4" podStartSLOduration=2.996391971 podStartE2EDuration="2.996391971s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:28:45.996169464 +0000 UTC m=+1148.686241435" watchObservedRunningTime="2026-03-12 13:28:45.996391971 +0000 UTC m=+1148.686463942" Mar 12 13:28:46 crc kubenswrapper[4921]: I0312 13:28:46.077630 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-mdvhh"] Mar 12 13:28:46 crc kubenswrapper[4921]: I0312 13:28:46.088002 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-mdvhh"] Mar 12 13:28:46 crc kubenswrapper[4921]: I0312 13:28:46.961259 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" event={"ID":"98cd11fa-112d-45fe-b67e-ae910048dfbf","Type":"ContainerStarted","Data":"79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421"} Mar 12 13:28:46 crc kubenswrapper[4921]: I0312 13:28:46.988587 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" podStartSLOduration=3.98856734 podStartE2EDuration="3.98856734s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:28:46.985789825 +0000 UTC m=+1149.675861796" watchObservedRunningTime="2026-03-12 
13:28:46.98856734 +0000 UTC m=+1149.678639321" Mar 12 13:28:47 crc kubenswrapper[4921]: I0312 13:28:47.971162 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:47 crc kubenswrapper[4921]: I0312 13:28:47.995639 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1548b5eb-2638-450c-ad0b-cf217b718b1f" path="/var/lib/kubelet/pods/1548b5eb-2638-450c-ad0b-cf217b718b1f/volumes" Mar 12 13:28:50 crc kubenswrapper[4921]: I0312 13:28:50.000949 4921 generic.go:334] "Generic (PLEG): container finished" podID="e64eca53-8c21-401f-96bb-a74e61cb1ea5" containerID="8ae6415b86e637b09ed6ed8eb721da79db2b5d0cf0ae025f5ebd65187ff5ba71" exitCode=0 Mar 12 13:28:50 crc kubenswrapper[4921]: I0312 13:28:50.000977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pkq8z" event={"ID":"e64eca53-8c21-401f-96bb-a74e61cb1ea5","Type":"ContainerDied","Data":"8ae6415b86e637b09ed6ed8eb721da79db2b5d0cf0ae025f5ebd65187ff5ba71"} Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.665038 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.748656 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-credential-keys\") pod \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.748705 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-fernet-keys\") pod \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.748835 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-scripts\") pod \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.748883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlv58\" (UniqueName: \"kubernetes.io/projected/e64eca53-8c21-401f-96bb-a74e61cb1ea5-kube-api-access-nlv58\") pod \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.748984 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-combined-ca-bundle\") pod \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.749016 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-config-data\") pod \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\" (UID: \"e64eca53-8c21-401f-96bb-a74e61cb1ea5\") " Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.755066 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e64eca53-8c21-401f-96bb-a74e61cb1ea5" (UID: "e64eca53-8c21-401f-96bb-a74e61cb1ea5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.765998 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e64eca53-8c21-401f-96bb-a74e61cb1ea5" (UID: "e64eca53-8c21-401f-96bb-a74e61cb1ea5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.767916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e64eca53-8c21-401f-96bb-a74e61cb1ea5-kube-api-access-nlv58" (OuterVolumeSpecName: "kube-api-access-nlv58") pod "e64eca53-8c21-401f-96bb-a74e61cb1ea5" (UID: "e64eca53-8c21-401f-96bb-a74e61cb1ea5"). InnerVolumeSpecName "kube-api-access-nlv58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.768499 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-scripts" (OuterVolumeSpecName: "scripts") pod "e64eca53-8c21-401f-96bb-a74e61cb1ea5" (UID: "e64eca53-8c21-401f-96bb-a74e61cb1ea5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.774024 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e64eca53-8c21-401f-96bb-a74e61cb1ea5" (UID: "e64eca53-8c21-401f-96bb-a74e61cb1ea5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.774234 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-config-data" (OuterVolumeSpecName: "config-data") pod "e64eca53-8c21-401f-96bb-a74e61cb1ea5" (UID: "e64eca53-8c21-401f-96bb-a74e61cb1ea5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.850659 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.850690 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.850700 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlv58\" (UniqueName: \"kubernetes.io/projected/e64eca53-8c21-401f-96bb-a74e61cb1ea5-kube-api-access-nlv58\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.850713 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 
13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.850722 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:53 crc kubenswrapper[4921]: I0312 13:28:53.850730 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e64eca53-8c21-401f-96bb-a74e61cb1ea5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.032481 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pkq8z" event={"ID":"e64eca53-8c21-401f-96bb-a74e61cb1ea5","Type":"ContainerDied","Data":"bc1cc549d66afb440a6f75461f1e6edd9c7b09bda42b7816d932d6e19c724927"} Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.032537 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1cc549d66afb440a6f75461f1e6edd9c7b09bda42b7816d932d6e19c724927" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.032604 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pkq8z" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.087155 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.193511 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-srs62"] Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.193799 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="dnsmasq-dns" containerID="cri-o://d919f77d2517b14200444846ad80d74de444231258e626c9d3d8594c3eb01f3a" gracePeriod=10 Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.318489 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.757714 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pkq8z"] Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.766258 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pkq8z"] Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.859523 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kf74w"] Mar 12 13:28:54 crc kubenswrapper[4921]: E0312 13:28:54.859951 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1548b5eb-2638-450c-ad0b-cf217b718b1f" containerName="init" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.859969 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1548b5eb-2638-450c-ad0b-cf217b718b1f" containerName="init" Mar 12 13:28:54 crc kubenswrapper[4921]: E0312 
13:28:54.859982 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e64eca53-8c21-401f-96bb-a74e61cb1ea5" containerName="keystone-bootstrap" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.859988 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e64eca53-8c21-401f-96bb-a74e61cb1ea5" containerName="keystone-bootstrap" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.860139 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1548b5eb-2638-450c-ad0b-cf217b718b1f" containerName="init" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.860153 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e64eca53-8c21-401f-96bb-a74e61cb1ea5" containerName="keystone-bootstrap" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.860650 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.866780 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.866916 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kf74w"] Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.867127 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.867136 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.868046 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.872029 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4ws54" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.975165 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-fernet-keys\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.975251 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-scripts\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.975290 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-credential-keys\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.975318 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-config-data\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.975339 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbb8\" (UniqueName: \"kubernetes.io/projected/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-kube-api-access-kzbb8\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:54 crc kubenswrapper[4921]: I0312 13:28:54.975418 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-combined-ca-bundle\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.040648 4921 generic.go:334] "Generic (PLEG): container finished" podID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerID="d919f77d2517b14200444846ad80d74de444231258e626c9d3d8594c3eb01f3a" exitCode=0 Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.040694 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" event={"ID":"f12d28a3-bd3a-4484-b5a2-98721ada3b7e","Type":"ContainerDied","Data":"d919f77d2517b14200444846ad80d74de444231258e626c9d3d8594c3eb01f3a"} Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.077466 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-combined-ca-bundle\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.077596 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-fernet-keys\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.077676 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-scripts\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " 
pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.077709 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-credential-keys\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.077752 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-config-data\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.077785 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbb8\" (UniqueName: \"kubernetes.io/projected/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-kube-api-access-kzbb8\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.083131 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-credential-keys\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.083577 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-fernet-keys\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.083970 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-scripts\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.085356 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-combined-ca-bundle\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.091315 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-config-data\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.100321 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbb8\" (UniqueName: \"kubernetes.io/projected/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-kube-api-access-kzbb8\") pod \"keystone-bootstrap-kf74w\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.190871 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:28:55 crc kubenswrapper[4921]: I0312 13:28:55.997490 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e64eca53-8c21-401f-96bb-a74e61cb1ea5" path="/var/lib/kubelet/pods/e64eca53-8c21-401f-96bb-a74e61cb1ea5/volumes" Mar 12 13:28:56 crc kubenswrapper[4921]: I0312 13:28:56.324280 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:28:56 crc kubenswrapper[4921]: I0312 13:28:56.324352 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:28:59 crc kubenswrapper[4921]: I0312 13:28:59.316216 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.126:5353: connect: connection refused" Mar 12 13:29:02 crc kubenswrapper[4921]: E0312 13:29:02.011786 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 12 13:29:02 crc kubenswrapper[4921]: E0312 13:29:02.012187 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbkt8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-4d9jl_openstack(cb9315da-7a44-4703-bc68-935d517a4412): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 13:29:02 crc kubenswrapper[4921]: E0312 13:29:02.013696 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-4d9jl" 
podUID="cb9315da-7a44-4703-bc68-935d517a4412" Mar 12 13:29:02 crc kubenswrapper[4921]: E0312 13:29:02.091220 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-4d9jl" podUID="cb9315da-7a44-4703-bc68-935d517a4412" Mar 12 13:29:03 crc kubenswrapper[4921]: E0312 13:29:03.039174 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 12 13:29:03 crc kubenswrapper[4921]: E0312 13:29:03.039540 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fm987,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-b8t7z_openstack(a9d3161d-0fd9-4116-8e46-74d541735563): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 13:29:03 crc kubenswrapper[4921]: E0312 13:29:03.040755 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-b8t7z" podUID="a9d3161d-0fd9-4116-8e46-74d541735563" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.110444 4921 generic.go:334] "Generic (PLEG): container finished" podID="5331d35e-1086-4e7f-aa2f-164117b3df44" containerID="1df323f90efffdd1e6938c4300a799d121d63fc7aead625f796dded8e776229e" exitCode=0 Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.110547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv8c4" event={"ID":"5331d35e-1086-4e7f-aa2f-164117b3df44","Type":"ContainerDied","Data":"1df323f90efffdd1e6938c4300a799d121d63fc7aead625f796dded8e776229e"} Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.114095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" event={"ID":"f12d28a3-bd3a-4484-b5a2-98721ada3b7e","Type":"ContainerDied","Data":"7fb166f97251e07d8935a57cda6ad33720b184cd24e8d17e2939c8080b223a13"} Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.114127 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fb166f97251e07d8935a57cda6ad33720b184cd24e8d17e2939c8080b223a13" Mar 12 13:29:03 crc kubenswrapper[4921]: E0312 13:29:03.116285 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-b8t7z" podUID="a9d3161d-0fd9-4116-8e46-74d541735563" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.168523 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.237745 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-config\") pod \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.237966 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-dns-svc\") pod \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.238173 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-nb\") pod \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.238212 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwmck\" (UniqueName: \"kubernetes.io/projected/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-kube-api-access-rwmck\") pod \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.238277 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-sb\") pod \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\" (UID: \"f12d28a3-bd3a-4484-b5a2-98721ada3b7e\") " Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.250169 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-kube-api-access-rwmck" (OuterVolumeSpecName: "kube-api-access-rwmck") pod "f12d28a3-bd3a-4484-b5a2-98721ada3b7e" (UID: "f12d28a3-bd3a-4484-b5a2-98721ada3b7e"). InnerVolumeSpecName "kube-api-access-rwmck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.301234 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-config" (OuterVolumeSpecName: "config") pod "f12d28a3-bd3a-4484-b5a2-98721ada3b7e" (UID: "f12d28a3-bd3a-4484-b5a2-98721ada3b7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.306316 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f12d28a3-bd3a-4484-b5a2-98721ada3b7e" (UID: "f12d28a3-bd3a-4484-b5a2-98721ada3b7e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.308332 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f12d28a3-bd3a-4484-b5a2-98721ada3b7e" (UID: "f12d28a3-bd3a-4484-b5a2-98721ada3b7e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.309294 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f12d28a3-bd3a-4484-b5a2-98721ada3b7e" (UID: "f12d28a3-bd3a-4484-b5a2-98721ada3b7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.342324 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.342359 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.342371 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.342384 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.342396 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwmck\" (UniqueName: \"kubernetes.io/projected/f12d28a3-bd3a-4484-b5a2-98721ada3b7e-kube-api-access-rwmck\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:03 crc kubenswrapper[4921]: I0312 13:29:03.458204 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kf74w"] Mar 12 13:29:03 
crc kubenswrapper[4921]: W0312 13:29:03.462568 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3e49b37_c533_4d52_9ed8_dcb54e4c0955.slice/crio-aea300d8b7586933ac78e8df188a43a28b1ac90c469593d15ecf6117c9167a78 WatchSource:0}: Error finding container aea300d8b7586933ac78e8df188a43a28b1ac90c469593d15ecf6117c9167a78: Status 404 returned error can't find the container with id aea300d8b7586933ac78e8df188a43a28b1ac90c469593d15ecf6117c9167a78 Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.122258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kf74w" event={"ID":"e3e49b37-c533-4d52-9ed8-dcb54e4c0955","Type":"ContainerStarted","Data":"5fc25203596b634784c962b5461abde49155a0352a648d5c2f127a66a9bc7a0f"} Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.124049 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kf74w" event={"ID":"e3e49b37-c533-4d52-9ed8-dcb54e4c0955","Type":"ContainerStarted","Data":"aea300d8b7586933ac78e8df188a43a28b1ac90c469593d15ecf6117c9167a78"} Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.125002 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jtcj" event={"ID":"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015","Type":"ContainerStarted","Data":"6dafa166e8b02279a8e99a63389b9d5e57fcd514cc809c05db9630ce997e963d"} Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.127081 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerStarted","Data":"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a"} Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.127147 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-srs62" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.163316 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kf74w" podStartSLOduration=10.163300058 podStartE2EDuration="10.163300058s" podCreationTimestamp="2026-03-12 13:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:04.147743219 +0000 UTC m=+1166.837815190" watchObservedRunningTime="2026-03-12 13:29:04.163300058 +0000 UTC m=+1166.853372029" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.166347 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-srs62"] Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.174724 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-srs62"] Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.180990 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-7jtcj" podStartSLOduration=3.860143614 podStartE2EDuration="21.180971313s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" firstStartedPulling="2026-03-12 13:28:45.668918462 +0000 UTC m=+1148.358990433" lastFinishedPulling="2026-03-12 13:29:02.989746161 +0000 UTC m=+1165.679818132" observedRunningTime="2026-03-12 13:29:04.174632087 +0000 UTC m=+1166.864704048" watchObservedRunningTime="2026-03-12 13:29:04.180971313 +0000 UTC m=+1166.871043284" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.666143 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.767561 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-combined-ca-bundle\") pod \"5331d35e-1086-4e7f-aa2f-164117b3df44\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.768007 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-config\") pod \"5331d35e-1086-4e7f-aa2f-164117b3df44\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.768189 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfq7n\" (UniqueName: \"kubernetes.io/projected/5331d35e-1086-4e7f-aa2f-164117b3df44-kube-api-access-sfq7n\") pod \"5331d35e-1086-4e7f-aa2f-164117b3df44\" (UID: \"5331d35e-1086-4e7f-aa2f-164117b3df44\") " Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.774163 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5331d35e-1086-4e7f-aa2f-164117b3df44-kube-api-access-sfq7n" (OuterVolumeSpecName: "kube-api-access-sfq7n") pod "5331d35e-1086-4e7f-aa2f-164117b3df44" (UID: "5331d35e-1086-4e7f-aa2f-164117b3df44"). InnerVolumeSpecName "kube-api-access-sfq7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.790691 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-config" (OuterVolumeSpecName: "config") pod "5331d35e-1086-4e7f-aa2f-164117b3df44" (UID: "5331d35e-1086-4e7f-aa2f-164117b3df44"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.792740 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5331d35e-1086-4e7f-aa2f-164117b3df44" (UID: "5331d35e-1086-4e7f-aa2f-164117b3df44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.870297 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.870325 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5331d35e-1086-4e7f-aa2f-164117b3df44-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:04 crc kubenswrapper[4921]: I0312 13:29:04.870336 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfq7n\" (UniqueName: \"kubernetes.io/projected/5331d35e-1086-4e7f-aa2f-164117b3df44-kube-api-access-sfq7n\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.135705 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fv8c4" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.136299 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fv8c4" event={"ID":"5331d35e-1086-4e7f-aa2f-164117b3df44","Type":"ContainerDied","Data":"673bd319bc175d697d1d696ab05c7951848ebcd55389d1d5e3a9ff7e272586a1"} Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.136390 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="673bd319bc175d697d1d696ab05c7951848ebcd55389d1d5e3a9ff7e272586a1" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.137837 4921 generic.go:334] "Generic (PLEG): container finished" podID="dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" containerID="6dafa166e8b02279a8e99a63389b9d5e57fcd514cc809c05db9630ce997e963d" exitCode=0 Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.137918 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jtcj" event={"ID":"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015","Type":"ContainerDied","Data":"6dafa166e8b02279a8e99a63389b9d5e57fcd514cc809c05db9630ce997e963d"} Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.140553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerStarted","Data":"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca"} Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.314680 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hjhq8"] Mar 12 13:29:05 crc kubenswrapper[4921]: E0312 13:29:05.315091 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="dnsmasq-dns" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.315114 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="dnsmasq-dns" 
Mar 12 13:29:05 crc kubenswrapper[4921]: E0312 13:29:05.315146 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5331d35e-1086-4e7f-aa2f-164117b3df44" containerName="neutron-db-sync" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.315154 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5331d35e-1086-4e7f-aa2f-164117b3df44" containerName="neutron-db-sync" Mar 12 13:29:05 crc kubenswrapper[4921]: E0312 13:29:05.315172 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="init" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.315181 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="init" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.315363 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" containerName="dnsmasq-dns" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.315387 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5331d35e-1086-4e7f-aa2f-164117b3df44" containerName="neutron-db-sync" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.316343 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.357659 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hjhq8"] Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.378782 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq6zg\" (UniqueName: \"kubernetes.io/projected/9bd9f449-a423-4e35-9177-37728b5fdcf9-kube-api-access-kq6zg\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.378864 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-dns-svc\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.378929 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-config\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.378945 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.378970 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.452473 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-69d56fdd9b-bhnqx"] Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.454858 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.461341 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.461794 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.461954 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.462101 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5n4pc" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.463861 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d56fdd9b-bhnqx"] Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.480991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-config\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.481032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.481060 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.481120 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq6zg\" (UniqueName: \"kubernetes.io/projected/9bd9f449-a423-4e35-9177-37728b5fdcf9-kube-api-access-kq6zg\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.481158 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-dns-svc\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.482204 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-dns-svc\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.482908 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-sb\") pod 
\"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.483429 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-config\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.486071 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.505307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq6zg\" (UniqueName: \"kubernetes.io/projected/9bd9f449-a423-4e35-9177-37728b5fdcf9-kube-api-access-kq6zg\") pod \"dnsmasq-dns-7b946d459c-hjhq8\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.582406 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-combined-ca-bundle\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.582478 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-httpd-config\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: 
\"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.582498 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf4nf\" (UniqueName: \"kubernetes.io/projected/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-kube-api-access-wf4nf\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.582745 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-config\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.582858 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-ovndb-tls-certs\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.651087 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.684825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-combined-ca-bundle\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.684904 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-httpd-config\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.684926 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf4nf\" (UniqueName: \"kubernetes.io/projected/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-kube-api-access-wf4nf\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.684983 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-config\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.685020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-ovndb-tls-certs\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc 
kubenswrapper[4921]: I0312 13:29:05.693352 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-combined-ca-bundle\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.696552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-ovndb-tls-certs\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.707890 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-httpd-config\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.707969 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-config\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.714443 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf4nf\" (UniqueName: \"kubernetes.io/projected/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-kube-api-access-wf4nf\") pod \"neutron-69d56fdd9b-bhnqx\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:05 crc kubenswrapper[4921]: I0312 13:29:05.776602 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.020867 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12d28a3-bd3a-4484-b5a2-98721ada3b7e" path="/var/lib/kubelet/pods/f12d28a3-bd3a-4484-b5a2-98721ada3b7e/volumes" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.201293 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hjhq8"] Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.308422 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-69d56fdd9b-bhnqx"] Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.575495 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7jtcj" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.618854 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-scripts\") pod \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.619419 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle\") pod \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.619550 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-config-data\") pod \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.619645 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-logs\") pod \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.619795 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxn69\" (UniqueName: \"kubernetes.io/projected/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-kube-api-access-bxn69\") pod \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.620661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-logs" (OuterVolumeSpecName: "logs") pod "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" (UID: "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.626185 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-kube-api-access-bxn69" (OuterVolumeSpecName: "kube-api-access-bxn69") pod "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" (UID: "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015"). InnerVolumeSpecName "kube-api-access-bxn69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.627209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-scripts" (OuterVolumeSpecName: "scripts") pod "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" (UID: "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:06 crc kubenswrapper[4921]: E0312 13:29:06.670797 4921 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle podName:dd04ca5a-99dd-40dc-9fb8-0722ca1e4015 nodeName:}" failed. No retries permitted until 2026-03-12 13:29:07.170762392 +0000 UTC m=+1169.860834363 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle") pod "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" (UID: "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015") : error deleting /var/lib/kubelet/pods/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015/volume-subpaths: remove /var/lib/kubelet/pods/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015/volume-subpaths: no such file or directory Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.673311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-config-data" (OuterVolumeSpecName: "config-data") pod "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" (UID: "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.721850 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.721885 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.721895 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:06 crc kubenswrapper[4921]: I0312 13:29:06.721903 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxn69\" (UniqueName: \"kubernetes.io/projected/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-kube-api-access-bxn69\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.168475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d56fdd9b-bhnqx" event={"ID":"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444","Type":"ContainerStarted","Data":"6208e6e16328b07ee57acad0b7d54e2206e69379e81312961f6b0b4954b50748"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.168540 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d56fdd9b-bhnqx" event={"ID":"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444","Type":"ContainerStarted","Data":"5de3ad8d3563a1c396e01a64104cc10969c7df05da22fb2e628de17147678961"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.168553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d56fdd9b-bhnqx" 
event={"ID":"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444","Type":"ContainerStarted","Data":"c86ec04c16847ce2fbe63232169f040cd0ee6d80498af3414c591b6e55fe6eff"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.168567 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.181947 4921 generic.go:334] "Generic (PLEG): container finished" podID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerID="f216143a8008bac79a5dc0d4fd1ded16e190386846073bf123e5c6be365d0bbc" exitCode=0 Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.182121 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" event={"ID":"9bd9f449-a423-4e35-9177-37728b5fdcf9","Type":"ContainerDied","Data":"f216143a8008bac79a5dc0d4fd1ded16e190386846073bf123e5c6be365d0bbc"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.182156 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" event={"ID":"9bd9f449-a423-4e35-9177-37728b5fdcf9","Type":"ContainerStarted","Data":"260f9baa05851cf421e6eeb8c04458d9e3d88cc1a450833d3c8a552bec8ee049"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.191854 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-69d56fdd9b-bhnqx" podStartSLOduration=2.191834524 podStartE2EDuration="2.191834524s" podCreationTimestamp="2026-03-12 13:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:07.183455981 +0000 UTC m=+1169.873527962" watchObservedRunningTime="2026-03-12 13:29:07.191834524 +0000 UTC m=+1169.881906495" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.191955 4921 generic.go:334] "Generic (PLEG): container finished" podID="e3e49b37-c533-4d52-9ed8-dcb54e4c0955" 
containerID="5fc25203596b634784c962b5461abde49155a0352a648d5c2f127a66a9bc7a0f" exitCode=0 Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.192022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kf74w" event={"ID":"e3e49b37-c533-4d52-9ed8-dcb54e4c0955","Type":"ContainerDied","Data":"5fc25203596b634784c962b5461abde49155a0352a648d5c2f127a66a9bc7a0f"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.201661 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-7jtcj" event={"ID":"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015","Type":"ContainerDied","Data":"3b2ab4fd3aa58a433f66a3db63cfffd236e8b57fa88bf19af965bfc51b618c3d"} Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.201698 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b2ab4fd3aa58a433f66a3db63cfffd236e8b57fa88bf19af965bfc51b618c3d" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.201751 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-7jtcj" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.228730 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle\") pod \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\" (UID: \"dd04ca5a-99dd-40dc-9fb8-0722ca1e4015\") " Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.234605 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" (UID: "dd04ca5a-99dd-40dc-9fb8-0722ca1e4015"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.330958 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.741769 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6cff966cbd-8c6fq"] Mar 12 13:29:07 crc kubenswrapper[4921]: E0312 13:29:07.742184 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" containerName="placement-db-sync" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.742204 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" containerName="placement-db-sync" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.742400 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" containerName="placement-db-sync" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.743412 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.746978 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.747086 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.747290 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sfv85" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.747342 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.747565 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.751521 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cff966cbd-8c6fq"] Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62fhm\" (UniqueName: \"kubernetes.io/projected/88666465-b61a-40e2-b20f-c8e6ad561ad8-kube-api-access-62fhm\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839421 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-combined-ca-bundle\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839446 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-internal-tls-certs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-public-tls-certs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839715 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-config-data\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839772 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88666465-b61a-40e2-b20f-c8e6ad561ad8-logs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.839805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-scripts\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.846963 4921 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/neutron-676f8c65df-nxrf5"] Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.848835 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.851264 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.853200 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.879653 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-676f8c65df-nxrf5"] Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.941707 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-public-tls-certs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-httpd-config\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942076 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-combined-ca-bundle\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942096 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmmtg\" (UniqueName: \"kubernetes.io/projected/57d73461-cb3e-4790-9576-1cb19e03815c-kube-api-access-lmmtg\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942143 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-config-data\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942160 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88666465-b61a-40e2-b20f-c8e6ad561ad8-logs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942177 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-scripts\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942237 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-ovndb-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942271 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-config\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62fhm\" (UniqueName: \"kubernetes.io/projected/88666465-b61a-40e2-b20f-c8e6ad561ad8-kube-api-access-62fhm\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942319 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-internal-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942345 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-combined-ca-bundle\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-internal-tls-certs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.942381 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-public-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.944581 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88666465-b61a-40e2-b20f-c8e6ad561ad8-logs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.951582 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-config-data\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.970164 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-public-tls-certs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.970231 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-internal-tls-certs\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.970445 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-combined-ca-bundle\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.970492 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-scripts\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:07 crc kubenswrapper[4921]: I0312 13:29:07.972600 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62fhm\" (UniqueName: \"kubernetes.io/projected/88666465-b61a-40e2-b20f-c8e6ad561ad8-kube-api-access-62fhm\") pod \"placement-6cff966cbd-8c6fq\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-internal-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043507 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-public-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043578 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-httpd-config\") pod 
\"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043600 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-combined-ca-bundle\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043626 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmmtg\" (UniqueName: \"kubernetes.io/projected/57d73461-cb3e-4790-9576-1cb19e03815c-kube-api-access-lmmtg\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043708 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-ovndb-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.043745 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-config\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.047020 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-internal-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " 
pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.047511 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-ovndb-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.047536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-public-tls-certs\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.048224 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-config\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.049004 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-httpd-config\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.050507 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-combined-ca-bundle\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.060758 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.061450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmmtg\" (UniqueName: \"kubernetes.io/projected/57d73461-cb3e-4790-9576-1cb19e03815c-kube-api-access-lmmtg\") pod \"neutron-676f8c65df-nxrf5\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.167124 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.273228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" event={"ID":"9bd9f449-a423-4e35-9177-37728b5fdcf9","Type":"ContainerStarted","Data":"56aeeb2d6810f39ed6253f94ef23636bfe49078974a5a2c62e0a0b5dbb5b4da7"} Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.302106 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" podStartSLOduration=3.302083245 podStartE2EDuration="3.302083245s" podCreationTimestamp="2026-03-12 13:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:08.295965209 +0000 UTC m=+1170.986037180" watchObservedRunningTime="2026-03-12 13:29:08.302083245 +0000 UTC m=+1170.992155216" Mar 12 13:29:08 crc kubenswrapper[4921]: I0312 13:29:08.617894 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cff966cbd-8c6fq"] Mar 12 13:29:09 crc kubenswrapper[4921]: I0312 13:29:09.281295 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:11 crc kubenswrapper[4921]: W0312 13:29:11.540951 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88666465_b61a_40e2_b20f_c8e6ad561ad8.slice/crio-fcf080e4ff2a6b1d6dfbda4cbc58b1e7d01dcb622a9d09c419f1d4e5a8d5a242 WatchSource:0}: Error finding container fcf080e4ff2a6b1d6dfbda4cbc58b1e7d01dcb622a9d09c419f1d4e5a8d5a242: Status 404 returned error can't find the container with id fcf080e4ff2a6b1d6dfbda4cbc58b1e7d01dcb622a9d09c419f1d4e5a8d5a242 Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.780168 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.833598 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-config-data\") pod \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.834117 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-combined-ca-bundle\") pod \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.834159 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzbb8\" (UniqueName: \"kubernetes.io/projected/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-kube-api-access-kzbb8\") pod \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.834214 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-fernet-keys\") pod \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\" (UID: 
\"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.834244 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-credential-keys\") pod \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.834270 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-scripts\") pod \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\" (UID: \"e3e49b37-c533-4d52-9ed8-dcb54e4c0955\") " Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.838052 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e3e49b37-c533-4d52-9ed8-dcb54e4c0955" (UID: "e3e49b37-c533-4d52-9ed8-dcb54e4c0955"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.840768 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e3e49b37-c533-4d52-9ed8-dcb54e4c0955" (UID: "e3e49b37-c533-4d52-9ed8-dcb54e4c0955"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.840907 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-scripts" (OuterVolumeSpecName: "scripts") pod "e3e49b37-c533-4d52-9ed8-dcb54e4c0955" (UID: "e3e49b37-c533-4d52-9ed8-dcb54e4c0955"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.841031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-kube-api-access-kzbb8" (OuterVolumeSpecName: "kube-api-access-kzbb8") pod "e3e49b37-c533-4d52-9ed8-dcb54e4c0955" (UID: "e3e49b37-c533-4d52-9ed8-dcb54e4c0955"). InnerVolumeSpecName "kube-api-access-kzbb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.879958 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3e49b37-c533-4d52-9ed8-dcb54e4c0955" (UID: "e3e49b37-c533-4d52-9ed8-dcb54e4c0955"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.892210 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-config-data" (OuterVolumeSpecName: "config-data") pod "e3e49b37-c533-4d52-9ed8-dcb54e4c0955" (UID: "e3e49b37-c533-4d52-9ed8-dcb54e4c0955"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.936080 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.936122 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzbb8\" (UniqueName: \"kubernetes.io/projected/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-kube-api-access-kzbb8\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.936140 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.936152 4921 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.936164 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:11 crc kubenswrapper[4921]: I0312 13:29:11.936174 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e49b37-c533-4d52-9ed8-dcb54e4c0955-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.333263 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kf74w" event={"ID":"e3e49b37-c533-4d52-9ed8-dcb54e4c0955","Type":"ContainerDied","Data":"aea300d8b7586933ac78e8df188a43a28b1ac90c469593d15ecf6117c9167a78"} Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 
13:29:12.333575 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aea300d8b7586933ac78e8df188a43a28b1ac90c469593d15ecf6117c9167a78" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.333657 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kf74w" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.358163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerStarted","Data":"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db"} Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.363697 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cff966cbd-8c6fq" event={"ID":"88666465-b61a-40e2-b20f-c8e6ad561ad8","Type":"ContainerStarted","Data":"da715f6ca5f367dbad5b9c201230eccc4275d1a915f195c25ae08bb7138d1cfb"} Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.363780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cff966cbd-8c6fq" event={"ID":"88666465-b61a-40e2-b20f-c8e6ad561ad8","Type":"ContainerStarted","Data":"902c256cb4282cba89b058f5c40357a799a90cac9c0a13138373c64e6b689cd2"} Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.363796 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cff966cbd-8c6fq" event={"ID":"88666465-b61a-40e2-b20f-c8e6ad561ad8","Type":"ContainerStarted","Data":"fcf080e4ff2a6b1d6dfbda4cbc58b1e7d01dcb622a9d09c419f1d4e5a8d5a242"} Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.364004 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.364047 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 
13:29:12.392755 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-676f8c65df-nxrf5"] Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.402095 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6cff966cbd-8c6fq" podStartSLOduration=5.402069604 podStartE2EDuration="5.402069604s" podCreationTimestamp="2026-03-12 13:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:12.388564666 +0000 UTC m=+1175.078636657" watchObservedRunningTime="2026-03-12 13:29:12.402069604 +0000 UTC m=+1175.092141575" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.923002 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c8b44c5c7-pc46f"] Mar 12 13:29:12 crc kubenswrapper[4921]: E0312 13:29:12.923679 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e49b37-c533-4d52-9ed8-dcb54e4c0955" containerName="keystone-bootstrap" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.923694 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e49b37-c533-4d52-9ed8-dcb54e4c0955" containerName="keystone-bootstrap" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.923858 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e49b37-c533-4d52-9ed8-dcb54e4c0955" containerName="keystone-bootstrap" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.924416 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.926942 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.927437 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4ws54" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.927698 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.927848 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.928095 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.929649 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.938484 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c8b44c5c7-pc46f"] Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.953877 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hln6m\" (UniqueName: \"kubernetes.io/projected/3fcdfac3-13b0-42ac-9396-587a7d443e2a-kube-api-access-hln6m\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.953921 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-fernet-keys\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " 
pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.954056 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-credential-keys\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.954206 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-public-tls-certs\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.954266 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-internal-tls-certs\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.954311 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-scripts\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.954333 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-config-data\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " 
pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:12 crc kubenswrapper[4921]: I0312 13:29:12.954475 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-combined-ca-bundle\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.056628 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-internal-tls-certs\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.056956 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-config-data\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.057034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-scripts\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.057131 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-combined-ca-bundle\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: 
I0312 13:29:13.057250 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hln6m\" (UniqueName: \"kubernetes.io/projected/3fcdfac3-13b0-42ac-9396-587a7d443e2a-kube-api-access-hln6m\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.057322 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-fernet-keys\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.057411 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-credential-keys\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.057502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-public-tls-certs\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.059968 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-internal-tls-certs\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.063717 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-scripts\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.067398 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-combined-ca-bundle\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.075300 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-public-tls-certs\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.090806 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-config-data\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.091447 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-fernet-keys\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.093199 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fcdfac3-13b0-42ac-9396-587a7d443e2a-credential-keys\") pod \"keystone-c8b44c5c7-pc46f\" (UID: 
\"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.108431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hln6m\" (UniqueName: \"kubernetes.io/projected/3fcdfac3-13b0-42ac-9396-587a7d443e2a-kube-api-access-hln6m\") pod \"keystone-c8b44c5c7-pc46f\" (UID: \"3fcdfac3-13b0-42ac-9396-587a7d443e2a\") " pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.241927 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.374453 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-676f8c65df-nxrf5" event={"ID":"57d73461-cb3e-4790-9576-1cb19e03815c","Type":"ContainerStarted","Data":"882f00afa5a8574380a3cbcdeb51f134e2802146be94414cd8bf3305d0077bf2"} Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.374548 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.374571 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-676f8c65df-nxrf5" event={"ID":"57d73461-cb3e-4790-9576-1cb19e03815c","Type":"ContainerStarted","Data":"694464c1f45d4e2e6d1383d11455ef2cd84aec943d4c7963d7e68de0022a32c6"} Mar 12 13:29:13 crc kubenswrapper[4921]: I0312 13:29:13.374586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-676f8c65df-nxrf5" event={"ID":"57d73461-cb3e-4790-9576-1cb19e03815c","Type":"ContainerStarted","Data":"e9cc563ce259d5dcbe0713e35563657a208561df51d83da10fd9ad4a0e4c24e7"} Mar 12 13:29:14 crc kubenswrapper[4921]: I0312 13:29:14.256964 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-676f8c65df-nxrf5" podStartSLOduration=7.256944322 
podStartE2EDuration="7.256944322s" podCreationTimestamp="2026-03-12 13:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:13.400616098 +0000 UTC m=+1176.090688089" watchObservedRunningTime="2026-03-12 13:29:14.256944322 +0000 UTC m=+1176.947016293" Mar 12 13:29:14 crc kubenswrapper[4921]: I0312 13:29:14.263593 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c8b44c5c7-pc46f"] Mar 12 13:29:14 crc kubenswrapper[4921]: I0312 13:29:14.396959 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c8b44c5c7-pc46f" event={"ID":"3fcdfac3-13b0-42ac-9396-587a7d443e2a","Type":"ContainerStarted","Data":"5255aab9d95318fa73359f458290b62183eeef12144bac3fb674ac807e089acf"} Mar 12 13:29:15 crc kubenswrapper[4921]: I0312 13:29:15.431416 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c8b44c5c7-pc46f" event={"ID":"3fcdfac3-13b0-42ac-9396-587a7d443e2a","Type":"ContainerStarted","Data":"9c17cb460c9a80fefe8c78b3efbb1b0f91da4a4d6d596f2f3c9638877577f097"} Mar 12 13:29:15 crc kubenswrapper[4921]: I0312 13:29:15.433952 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:15 crc kubenswrapper[4921]: I0312 13:29:15.461017 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c8b44c5c7-pc46f" podStartSLOduration=3.460993567 podStartE2EDuration="3.460993567s" podCreationTimestamp="2026-03-12 13:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:15.454305944 +0000 UTC m=+1178.144377915" watchObservedRunningTime="2026-03-12 13:29:15.460993567 +0000 UTC m=+1178.151065538" Mar 12 13:29:15 crc kubenswrapper[4921]: I0312 13:29:15.661097 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:15 crc kubenswrapper[4921]: I0312 13:29:15.742699 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-hjjtw"] Mar 12 13:29:15 crc kubenswrapper[4921]: I0312 13:29:15.742971 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerName="dnsmasq-dns" containerID="cri-o://79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421" gracePeriod=10 Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.376567 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.438912 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgks4\" (UniqueName: \"kubernetes.io/projected/98cd11fa-112d-45fe-b67e-ae910048dfbf-kube-api-access-tgks4\") pod \"98cd11fa-112d-45fe-b67e-ae910048dfbf\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.438977 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-config\") pod \"98cd11fa-112d-45fe-b67e-ae910048dfbf\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.439049 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-dns-svc\") pod \"98cd11fa-112d-45fe-b67e-ae910048dfbf\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.439152 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-nb\") pod \"98cd11fa-112d-45fe-b67e-ae910048dfbf\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.439208 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-sb\") pod \"98cd11fa-112d-45fe-b67e-ae910048dfbf\" (UID: \"98cd11fa-112d-45fe-b67e-ae910048dfbf\") " Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.445968 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cd11fa-112d-45fe-b67e-ae910048dfbf-kube-api-access-tgks4" (OuterVolumeSpecName: "kube-api-access-tgks4") pod "98cd11fa-112d-45fe-b67e-ae910048dfbf" (UID: "98cd11fa-112d-45fe-b67e-ae910048dfbf"). InnerVolumeSpecName "kube-api-access-tgks4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.447118 4921 generic.go:334] "Generic (PLEG): container finished" podID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerID="79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421" exitCode=0 Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.448080 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.448446 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" event={"ID":"98cd11fa-112d-45fe-b67e-ae910048dfbf","Type":"ContainerDied","Data":"79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421"} Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.448472 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-hjjtw" event={"ID":"98cd11fa-112d-45fe-b67e-ae910048dfbf","Type":"ContainerDied","Data":"f0de17c78f6fa375cb5eedbdf4927c021c412a8d544012d847308dbcb59e2838"} Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.448487 4921 scope.go:117] "RemoveContainer" containerID="79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.487269 4921 scope.go:117] "RemoveContainer" containerID="d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.489005 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98cd11fa-112d-45fe-b67e-ae910048dfbf" (UID: "98cd11fa-112d-45fe-b67e-ae910048dfbf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.493841 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-config" (OuterVolumeSpecName: "config") pod "98cd11fa-112d-45fe-b67e-ae910048dfbf" (UID: "98cd11fa-112d-45fe-b67e-ae910048dfbf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.494318 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98cd11fa-112d-45fe-b67e-ae910048dfbf" (UID: "98cd11fa-112d-45fe-b67e-ae910048dfbf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.494631 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98cd11fa-112d-45fe-b67e-ae910048dfbf" (UID: "98cd11fa-112d-45fe-b67e-ae910048dfbf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.522950 4921 scope.go:117] "RemoveContainer" containerID="79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421" Mar 12 13:29:16 crc kubenswrapper[4921]: E0312 13:29:16.523926 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421\": container with ID starting with 79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421 not found: ID does not exist" containerID="79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.523980 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421"} err="failed to get container status \"79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421\": rpc error: code = NotFound desc = could not find container 
\"79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421\": container with ID starting with 79b07f3a3c4edce9ea5e923bef8cb047bc661311581d43fe6448f6722f74e421 not found: ID does not exist" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.524014 4921 scope.go:117] "RemoveContainer" containerID="d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20" Mar 12 13:29:16 crc kubenswrapper[4921]: E0312 13:29:16.524286 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20\": container with ID starting with d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20 not found: ID does not exist" containerID="d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.524310 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20"} err="failed to get container status \"d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20\": rpc error: code = NotFound desc = could not find container \"d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20\": container with ID starting with d989f5eb4772ff26bc287180e5532582475e95b315084af5d2eef2800c320e20 not found: ID does not exist" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.542000 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.542045 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:16 crc 
kubenswrapper[4921]: I0312 13:29:16.542060 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgks4\" (UniqueName: \"kubernetes.io/projected/98cd11fa-112d-45fe-b67e-ae910048dfbf-kube-api-access-tgks4\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.542073 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.542085 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98cd11fa-112d-45fe-b67e-ae910048dfbf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.777869 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-hjjtw"] Mar 12 13:29:16 crc kubenswrapper[4921]: I0312 13:29:16.783969 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-hjjtw"] Mar 12 13:29:17 crc kubenswrapper[4921]: I0312 13:29:17.470353 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8t7z" event={"ID":"a9d3161d-0fd9-4116-8e46-74d541735563","Type":"ContainerStarted","Data":"05a4aa398d018d3520833cceba9c5a91b21d92a30f54ad282d3f72f506cce6f5"} Mar 12 13:29:17 crc kubenswrapper[4921]: I0312 13:29:17.492902 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-b8t7z" podStartSLOduration=4.380593127 podStartE2EDuration="34.492875714s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" firstStartedPulling="2026-03-12 13:28:45.489556606 +0000 UTC m=+1148.179628577" lastFinishedPulling="2026-03-12 13:29:15.601839183 +0000 UTC m=+1178.291911164" observedRunningTime="2026-03-12 13:29:17.487875663 +0000 UTC m=+1180.177947654" watchObservedRunningTime="2026-03-12 13:29:17.492875714 +0000 
UTC m=+1180.182947695" Mar 12 13:29:17 crc kubenswrapper[4921]: I0312 13:29:17.996293 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" path="/var/lib/kubelet/pods/98cd11fa-112d-45fe-b67e-ae910048dfbf/volumes" Mar 12 13:29:21 crc kubenswrapper[4921]: I0312 13:29:21.505105 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9d3161d-0fd9-4116-8e46-74d541735563" containerID="05a4aa398d018d3520833cceba9c5a91b21d92a30f54ad282d3f72f506cce6f5" exitCode=0 Mar 12 13:29:21 crc kubenswrapper[4921]: I0312 13:29:21.505144 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8t7z" event={"ID":"a9d3161d-0fd9-4116-8e46-74d541735563","Type":"ContainerDied","Data":"05a4aa398d018d3520833cceba9c5a91b21d92a30f54ad282d3f72f506cce6f5"} Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.521588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4d9jl" event={"ID":"cb9315da-7a44-4703-bc68-935d517a4412","Type":"ContainerStarted","Data":"549fabab75489983c2ff504acf4fb7d6e20b4ed5a226b2274e4a242f4d2f24e5"} Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.529503 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerStarted","Data":"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f"} Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.529879 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.529922 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-central-agent" containerID="cri-o://9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" gracePeriod=30 Mar 12 13:29:22 crc kubenswrapper[4921]: 
I0312 13:29:22.529987 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="sg-core" containerID="cri-o://f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" gracePeriod=30 Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.529973 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-notification-agent" containerID="cri-o://8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" gracePeriod=30 Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.529941 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="proxy-httpd" containerID="cri-o://accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" gracePeriod=30 Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.557933 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-4d9jl" podStartSLOduration=3.543623362 podStartE2EDuration="39.557907775s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" firstStartedPulling="2026-03-12 13:28:45.66885317 +0000 UTC m=+1148.358925141" lastFinishedPulling="2026-03-12 13:29:21.683137563 +0000 UTC m=+1184.373209554" observedRunningTime="2026-03-12 13:29:22.549517692 +0000 UTC m=+1185.239589673" watchObservedRunningTime="2026-03-12 13:29:22.557907775 +0000 UTC m=+1185.247979776" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.574764 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.384773467 podStartE2EDuration="39.574743044s" podCreationTimestamp="2026-03-12 13:28:43 +0000 UTC" firstStartedPulling="2026-03-12 13:28:45.507370675 +0000 UTC m=+1148.197442646" 
lastFinishedPulling="2026-03-12 13:29:21.697340232 +0000 UTC m=+1184.387412223" observedRunningTime="2026-03-12 13:29:22.573391733 +0000 UTC m=+1185.263463724" watchObservedRunningTime="2026-03-12 13:29:22.574743044 +0000 UTC m=+1185.264815035" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.896453 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962526 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-db-sync-config-data\") pod \"a9d3161d-0fd9-4116-8e46-74d541735563\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962661 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-config-data\") pod \"a9d3161d-0fd9-4116-8e46-74d541735563\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962739 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm987\" (UniqueName: \"kubernetes.io/projected/a9d3161d-0fd9-4116-8e46-74d541735563-kube-api-access-fm987\") pod \"a9d3161d-0fd9-4116-8e46-74d541735563\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962779 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9d3161d-0fd9-4116-8e46-74d541735563-etc-machine-id\") pod \"a9d3161d-0fd9-4116-8e46-74d541735563\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962853 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-combined-ca-bundle\") pod \"a9d3161d-0fd9-4116-8e46-74d541735563\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-scripts\") pod \"a9d3161d-0fd9-4116-8e46-74d541735563\" (UID: \"a9d3161d-0fd9-4116-8e46-74d541735563\") " Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.962890 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9d3161d-0fd9-4116-8e46-74d541735563-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a9d3161d-0fd9-4116-8e46-74d541735563" (UID: "a9d3161d-0fd9-4116-8e46-74d541735563"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.963239 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9d3161d-0fd9-4116-8e46-74d541735563-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.973661 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a9d3161d-0fd9-4116-8e46-74d541735563" (UID: "a9d3161d-0fd9-4116-8e46-74d541735563"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.974438 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-scripts" (OuterVolumeSpecName: "scripts") pod "a9d3161d-0fd9-4116-8e46-74d541735563" (UID: "a9d3161d-0fd9-4116-8e46-74d541735563"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:22 crc kubenswrapper[4921]: I0312 13:29:22.974509 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9d3161d-0fd9-4116-8e46-74d541735563-kube-api-access-fm987" (OuterVolumeSpecName: "kube-api-access-fm987") pod "a9d3161d-0fd9-4116-8e46-74d541735563" (UID: "a9d3161d-0fd9-4116-8e46-74d541735563"). InnerVolumeSpecName "kube-api-access-fm987". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.009914 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9d3161d-0fd9-4116-8e46-74d541735563" (UID: "a9d3161d-0fd9-4116-8e46-74d541735563"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.020963 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-config-data" (OuterVolumeSpecName: "config-data") pod "a9d3161d-0fd9-4116-8e46-74d541735563" (UID: "a9d3161d-0fd9-4116-8e46-74d541735563"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.064399 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm987\" (UniqueName: \"kubernetes.io/projected/a9d3161d-0fd9-4116-8e46-74d541735563-kube-api-access-fm987\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.064431 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.064442 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.064451 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.064459 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9d3161d-0fd9-4116-8e46-74d541735563-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.387251 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.472707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-scripts\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.472855 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-log-httpd\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.472929 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-sg-core-conf-yaml\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.472978 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vrmt\" (UniqueName: \"kubernetes.io/projected/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-kube-api-access-7vrmt\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.473045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-combined-ca-bundle\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.473080 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-run-httpd\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.473113 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-config-data\") pod \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\" (UID: \"4af3faf6-1c64-4aa8-81ef-4093bf9ed247\") " Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.473693 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.473853 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.474437 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.479559 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-kube-api-access-7vrmt" (OuterVolumeSpecName: "kube-api-access-7vrmt") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). 
InnerVolumeSpecName "kube-api-access-7vrmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.482282 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-scripts" (OuterVolumeSpecName: "scripts") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.511499 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.545763 4921 generic.go:334] "Generic (PLEG): container finished" podID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" exitCode=0 Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.545802 4921 generic.go:334] "Generic (PLEG): container finished" podID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" exitCode=2 Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.545853 4921 generic.go:334] "Generic (PLEG): container finished" podID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" exitCode=0 Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.545862 4921 generic.go:334] "Generic (PLEG): container finished" podID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" exitCode=0 Mar 12 
13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.545949 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerDied","Data":"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f"} Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.545979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerDied","Data":"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db"} Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.546025 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerDied","Data":"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca"} Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.546036 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerDied","Data":"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a"} Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.546047 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4af3faf6-1c64-4aa8-81ef-4093bf9ed247","Type":"ContainerDied","Data":"78d20937edc9058d4e201a4c65f2409f80321a75b8da59ff0e74eab259893f3b"} Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.546777 4921 scope.go:117] "RemoveContainer" containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.546960 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.559380 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-b8t7z" event={"ID":"a9d3161d-0fd9-4116-8e46-74d541735563","Type":"ContainerDied","Data":"50a3a92afd3945fc836f67deea5a57529e0917e1e89c5d340cf9890b87c50df1"} Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.559465 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a3a92afd3945fc836f67deea5a57529e0917e1e89c5d340cf9890b87c50df1" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.559517 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-b8t7z" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.568021 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.576750 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.576779 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.576793 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vrmt\" (UniqueName: \"kubernetes.io/projected/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-kube-api-access-7vrmt\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.576806 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.576832 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.596360 4921 scope.go:117] "RemoveContainer" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.612845 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-config-data" (OuterVolumeSpecName: "config-data") pod "4af3faf6-1c64-4aa8-81ef-4093bf9ed247" (UID: "4af3faf6-1c64-4aa8-81ef-4093bf9ed247"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.622702 4921 scope.go:117] "RemoveContainer" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.652713 4921 scope.go:117] "RemoveContainer" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.678693 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3faf6-1c64-4aa8-81ef-4093bf9ed247-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.680514 4921 scope.go:117] "RemoveContainer" containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.682562 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": container with ID starting with accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f not found: ID does not exist" containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.682601 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f"} err="failed to get container status \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": rpc error: code = NotFound desc = could not find container \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": container with ID starting with accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.682627 4921 scope.go:117] 
"RemoveContainer" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.683136 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": container with ID starting with f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db not found: ID does not exist" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.683159 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db"} err="failed to get container status \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": rpc error: code = NotFound desc = could not find container \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": container with ID starting with f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.683174 4921 scope.go:117] "RemoveContainer" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.683449 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": container with ID starting with 8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca not found: ID does not exist" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.683474 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca"} err="failed to get container status \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": rpc error: code = NotFound desc = could not find container \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": container with ID starting with 8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.683490 4921 scope.go:117] "RemoveContainer" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.683764 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": container with ID starting with 9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a not found: ID does not exist" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.683835 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a"} err="failed to get container status \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": rpc error: code = NotFound desc = could not find container \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": container with ID starting with 9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.683860 4921 scope.go:117] "RemoveContainer" containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.684089 4921 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f"} err="failed to get container status \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": rpc error: code = NotFound desc = could not find container \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": container with ID starting with accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.684113 4921 scope.go:117] "RemoveContainer" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.684696 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db"} err="failed to get container status \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": rpc error: code = NotFound desc = could not find container \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": container with ID starting with f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.684738 4921 scope.go:117] "RemoveContainer" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685006 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca"} err="failed to get container status \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": rpc error: code = NotFound desc = could not find container \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": container with ID starting with 8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca not 
found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685036 4921 scope.go:117] "RemoveContainer" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685304 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a"} err="failed to get container status \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": rpc error: code = NotFound desc = could not find container \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": container with ID starting with 9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685325 4921 scope.go:117] "RemoveContainer" containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685562 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f"} err="failed to get container status \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": rpc error: code = NotFound desc = could not find container \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": container with ID starting with accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685591 4921 scope.go:117] "RemoveContainer" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685891 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db"} err="failed to get 
container status \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": rpc error: code = NotFound desc = could not find container \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": container with ID starting with f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.685919 4921 scope.go:117] "RemoveContainer" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.686150 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca"} err="failed to get container status \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": rpc error: code = NotFound desc = could not find container \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": container with ID starting with 8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.686186 4921 scope.go:117] "RemoveContainer" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.686399 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a"} err="failed to get container status \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": rpc error: code = NotFound desc = could not find container \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": container with ID starting with 9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.686423 4921 scope.go:117] "RemoveContainer" 
containerID="accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.686632 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f"} err="failed to get container status \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": rpc error: code = NotFound desc = could not find container \"accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f\": container with ID starting with accf4e0cb20a656d5e278179967dab7cf7c430edcbd29036da1140035db60a4f not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.686658 4921 scope.go:117] "RemoveContainer" containerID="f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.687046 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db"} err="failed to get container status \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": rpc error: code = NotFound desc = could not find container \"f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db\": container with ID starting with f268c12bde30c2672ff9ad9bd0edc83e4285937d19470228f339a4ec4f7e90db not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.687073 4921 scope.go:117] "RemoveContainer" containerID="8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.687281 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca"} err="failed to get container status \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": rpc error: code = NotFound desc = could 
not find container \"8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca\": container with ID starting with 8ea181b4ee793446725dd0d9afe2c0cd390a29c982732084b9337af8fdb322ca not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.687306 4921 scope.go:117] "RemoveContainer" containerID="9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.687514 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a"} err="failed to get container status \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": rpc error: code = NotFound desc = could not find container \"9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a\": container with ID starting with 9db9f18ebf813d9cf8ff84634a4a298255c4affb7582d2cad114f2a39da4a54a not found: ID does not exist" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757048 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757355 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerName="init" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757370 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerName="init" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757387 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-central-agent" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757394 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-central-agent" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757416 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerName="dnsmasq-dns" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757423 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerName="dnsmasq-dns" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757433 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="proxy-httpd" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757439 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="proxy-httpd" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757446 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-notification-agent" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757453 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-notification-agent" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757460 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9d3161d-0fd9-4116-8e46-74d541735563" containerName="cinder-db-sync" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757466 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9d3161d-0fd9-4116-8e46-74d541735563" containerName="cinder-db-sync" Mar 12 13:29:23 crc kubenswrapper[4921]: E0312 13:29:23.757474 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="sg-core" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757479 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="sg-core" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757658 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-notification-agent" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757676 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9d3161d-0fd9-4116-8e46-74d541735563" containerName="cinder-db-sync" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757684 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="sg-core" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757692 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="ceilometer-central-agent" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757703 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cd11fa-112d-45fe-b67e-ae910048dfbf" containerName="dnsmasq-dns" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.757712 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" containerName="proxy-httpd" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.758481 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.761873 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.762047 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-kqsqg" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.762195 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.762418 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.779571 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb54c\" (UniqueName: \"kubernetes.io/projected/cf411ab8-cf05-4dbb-99d6-05c15227d433-kube-api-access-pb54c\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.779625 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf411ab8-cf05-4dbb-99d6-05c15227d433-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.779666 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.779693 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.779731 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.779752 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-scripts\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.823359 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.832987 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-75x54"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.836678 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.861155 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-75x54"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881002 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-config\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881048 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-dns-svc\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881082 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb54c\" (UniqueName: \"kubernetes.io/projected/cf411ab8-cf05-4dbb-99d6-05c15227d433-kube-api-access-pb54c\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881123 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf411ab8-cf05-4dbb-99d6-05c15227d433-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881177 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-sb\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881202 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881241 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881327 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-scripts\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881350 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-nb\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: 
\"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881364 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf411ab8-cf05-4dbb-99d6-05c15227d433-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.881382 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtm6q\" (UniqueName: \"kubernetes.io/projected/38361fc2-c7d9-44e0-a0bc-5874167f9f91-kube-api-access-mtm6q\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.891552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.893834 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.897418 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 
13:29:23.915580 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb54c\" (UniqueName: \"kubernetes.io/projected/cf411ab8-cf05-4dbb-99d6-05c15227d433-kube-api-access-pb54c\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.916032 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-scripts\") pod \"cinder-scheduler-0\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.920990 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.935522 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.945219 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.947234 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.950942 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.951142 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.960133 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984701 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-scripts\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984737 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8gh9\" (UniqueName: \"kubernetes.io/projected/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-kube-api-access-w8gh9\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984777 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-sb\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984892 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-log-httpd\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.984967 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-nb\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.985002 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtm6q\" (UniqueName: \"kubernetes.io/projected/38361fc2-c7d9-44e0-a0bc-5874167f9f91-kube-api-access-mtm6q\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.985034 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-run-httpd\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc 
kubenswrapper[4921]: I0312 13:29:23.985056 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-config-data\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.985108 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-config\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.985131 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-dns-svc\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.986926 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-nb\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.986997 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-config\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.987065 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-dns-svc\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:23 crc kubenswrapper[4921]: I0312 13:29:23.987415 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-sb\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.001748 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af3faf6-1c64-4aa8-81ef-4093bf9ed247" path="/var/lib/kubelet/pods/4af3faf6-1c64-4aa8-81ef-4093bf9ed247/volumes" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.006736 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtm6q\" (UniqueName: \"kubernetes.io/projected/38361fc2-c7d9-44e0-a0bc-5874167f9f91-kube-api-access-mtm6q\") pod \"dnsmasq-dns-f64d5748f-75x54\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.041882 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:24 crc kubenswrapper[4921]: E0312 13:29:24.042587 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-w8gh9 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.057476 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.058711 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.063720 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.072836 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087102 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de5063d-1ba7-4dd8-af7b-8d7286177244-logs\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087252 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data-custom\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087343 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-scripts\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087374 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-w8gh9\" (UniqueName: \"kubernetes.io/projected/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-kube-api-access-w8gh9\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087400 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnmtc\" (UniqueName: \"kubernetes.io/projected/3de5063d-1ba7-4dd8-af7b-8d7286177244-kube-api-access-dnmtc\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087602 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087642 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087656 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-log-httpd\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087710 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087758 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-scripts\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087800 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3de5063d-1ba7-4dd8-af7b-8d7286177244-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087870 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-run-httpd\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.087891 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-config-data\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.088620 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-run-httpd\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.090095 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-log-httpd\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.090506 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.090843 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-scripts\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.094157 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.095613 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.104447 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8gh9\" (UniqueName: \"kubernetes.io/projected/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-kube-api-access-w8gh9\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.108290 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-config-data\") pod \"ceilometer-0\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.168199 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190016 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3de5063d-1ba7-4dd8-af7b-8d7286177244-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190111 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de5063d-1ba7-4dd8-af7b-8d7286177244-logs\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190186 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190233 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data-custom\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnmtc\" (UniqueName: 
\"kubernetes.io/projected/3de5063d-1ba7-4dd8-af7b-8d7286177244-kube-api-access-dnmtc\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190319 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-scripts\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.190526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de5063d-1ba7-4dd8-af7b-8d7286177244-logs\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.191750 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3de5063d-1ba7-4dd8-af7b-8d7286177244-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.197085 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data-custom\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.197556 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.198848 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-scripts\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.199192 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.206657 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnmtc\" (UniqueName: \"kubernetes.io/projected/3de5063d-1ba7-4dd8-af7b-8d7286177244-kube-api-access-dnmtc\") pod \"cinder-api-0\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.375045 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.574594 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.596545 4921 generic.go:334] "Generic (PLEG): container finished" podID="cb9315da-7a44-4703-bc68-935d517a4412" containerID="549fabab75489983c2ff504acf4fb7d6e20b4ed5a226b2274e4a242f4d2f24e5" exitCode=0 Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.596668 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4d9jl" event={"ID":"cb9315da-7a44-4703-bc68-935d517a4412","Type":"ContainerDied","Data":"549fabab75489983c2ff504acf4fb7d6e20b4ed5a226b2274e4a242f4d2f24e5"} Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.602651 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.637606 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.680828 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-75x54"] Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701354 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-combined-ca-bundle\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701434 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-log-httpd\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701648 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8gh9\" (UniqueName: \"kubernetes.io/projected/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-kube-api-access-w8gh9\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701715 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-config-data\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701742 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-sg-core-conf-yaml\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") " Mar 12 
13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701783 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-run-httpd\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") "
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.701831 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-scripts\") pod \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\" (UID: \"6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5\") "
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.705108 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.705281 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.708150 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-scripts" (OuterVolumeSpecName: "scripts") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.708251 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-kube-api-access-w8gh9" (OuterVolumeSpecName: "kube-api-access-w8gh9") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "kube-api-access-w8gh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.708381 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-config-data" (OuterVolumeSpecName: "config-data") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.709023 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.711716 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" (UID: "6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803378 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8gh9\" (UniqueName: \"kubernetes.io/projected/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-kube-api-access-w8gh9\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803408 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803418 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803428 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803436 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803444 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.803452 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:24 crc kubenswrapper[4921]: I0312 13:29:24.884012 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.612774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf411ab8-cf05-4dbb-99d6-05c15227d433","Type":"ContainerStarted","Data":"8588d43d47c7715159c77c6c924e167b2be0f672f6f217b34c45393ce09ea035"}
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.614605 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3de5063d-1ba7-4dd8-af7b-8d7286177244","Type":"ContainerStarted","Data":"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66"}
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.614653 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3de5063d-1ba7-4dd8-af7b-8d7286177244","Type":"ContainerStarted","Data":"72d9d9bb02b16e2148fdb59cac9938b2f1ef3b9ab6a8adfc68ece4f0ad2339a5"}
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.618258 4921 generic.go:334] "Generic (PLEG): container finished" podID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerID="b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82" exitCode=0
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.618325 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-75x54" event={"ID":"38361fc2-c7d9-44e0-a0bc-5874167f9f91","Type":"ContainerDied","Data":"b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82"}
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.618404 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-75x54" event={"ID":"38361fc2-c7d9-44e0-a0bc-5874167f9f91","Type":"ContainerStarted","Data":"08a9d4738b1a3e23a5c15e28e5127ac80244db70bbd91fe86959c0ee52e942c4"}
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.618410 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.700235 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.714557 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.733656 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.736015 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.742693 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.743371 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.763693 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.824664 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-log-httpd\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.824990 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-run-httpd\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.825172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-config-data\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.825250 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-scripts\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.825302 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.825320 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdpp\" (UniqueName: \"kubernetes.io/projected/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-kube-api-access-qgdpp\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.825357 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928450 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-run-httpd\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928513 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-config-data\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928536 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-scripts\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928555 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdpp\" (UniqueName: \"kubernetes.io/projected/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-kube-api-access-qgdpp\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928589 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.928634 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-log-httpd\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.929107 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-log-httpd\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.929308 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-run-httpd\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.942205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.947384 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-config-data\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.950001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-scripts\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.959916 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:25 crc kubenswrapper[4921]: I0312 13:29:25.964233 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdpp\" (UniqueName: \"kubernetes.io/projected/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-kube-api-access-qgdpp\") pod \"ceilometer-0\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " pod="openstack/ceilometer-0"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.017781 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5" path="/var/lib/kubelet/pods/6af5b094-b4d3-40b7-b4fe-d9ac6c9eb5a5/volumes"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.055784 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.073993 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4d9jl"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.080444 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.135409 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-combined-ca-bundle\") pod \"cb9315da-7a44-4703-bc68-935d517a4412\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") "
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.135534 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbkt8\" (UniqueName: \"kubernetes.io/projected/cb9315da-7a44-4703-bc68-935d517a4412-kube-api-access-zbkt8\") pod \"cb9315da-7a44-4703-bc68-935d517a4412\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") "
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.135699 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-db-sync-config-data\") pod \"cb9315da-7a44-4703-bc68-935d517a4412\" (UID: \"cb9315da-7a44-4703-bc68-935d517a4412\") "
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.146025 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cb9315da-7a44-4703-bc68-935d517a4412" (UID: "cb9315da-7a44-4703-bc68-935d517a4412"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.148730 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9315da-7a44-4703-bc68-935d517a4412-kube-api-access-zbkt8" (OuterVolumeSpecName: "kube-api-access-zbkt8") pod "cb9315da-7a44-4703-bc68-935d517a4412" (UID: "cb9315da-7a44-4703-bc68-935d517a4412"). InnerVolumeSpecName "kube-api-access-zbkt8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.167888 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb9315da-7a44-4703-bc68-935d517a4412" (UID: "cb9315da-7a44-4703-bc68-935d517a4412"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.238126 4921 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.238153 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9315da-7a44-4703-bc68-935d517a4412-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.238165 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbkt8\" (UniqueName: \"kubernetes.io/projected/cb9315da-7a44-4703-bc68-935d517a4412-kube-api-access-zbkt8\") on node \"crc\" DevicePath \"\""
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.323870 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.323927 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.579798 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:29:26 crc kubenswrapper[4921]: W0312 13:29:26.587227 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0c119c7_ae7a_4f1c_9d57_8ae81fe32500.slice/crio-9c854b2f3ee25a0a9399c8f6a66d36484e172682ac103aed5971a8392f0d47f4 WatchSource:0}: Error finding container 9c854b2f3ee25a0a9399c8f6a66d36484e172682ac103aed5971a8392f0d47f4: Status 404 returned error can't find the container with id 9c854b2f3ee25a0a9399c8f6a66d36484e172682ac103aed5971a8392f0d47f4
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.637330 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3de5063d-1ba7-4dd8-af7b-8d7286177244","Type":"ContainerStarted","Data":"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3"}
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.637406 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api-log" containerID="cri-o://a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66" gracePeriod=30
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.637472 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.637498 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api" containerID="cri-o://af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3" gracePeriod=30
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.642298 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-75x54" event={"ID":"38361fc2-c7d9-44e0-a0bc-5874167f9f91","Type":"ContainerStarted","Data":"94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37"}
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.644157 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f64d5748f-75x54"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.645603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerStarted","Data":"9c854b2f3ee25a0a9399c8f6a66d36484e172682ac103aed5971a8392f0d47f4"}
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.655575 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf411ab8-cf05-4dbb-99d6-05c15227d433","Type":"ContainerStarted","Data":"ddc6034ca1768e62b2a90d78557e9de3eee96cf182d83646944f2a43d6164722"}
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.659580 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.659565875 podStartE2EDuration="2.659565875s" podCreationTimestamp="2026-03-12 13:29:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:26.658200994 +0000 UTC m=+1189.348272955" watchObservedRunningTime="2026-03-12 13:29:26.659565875 +0000 UTC m=+1189.349637846"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.662605 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-4d9jl" event={"ID":"cb9315da-7a44-4703-bc68-935d517a4412","Type":"ContainerDied","Data":"c1497eae4ebbacd53faf566f3d5608cd2094faf64633b6ea555c1eefd2bce89b"}
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.662637 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1497eae4ebbacd53faf566f3d5608cd2094faf64633b6ea555c1eefd2bce89b"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.662734 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-4d9jl"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.692000 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f64d5748f-75x54" podStartSLOduration=3.691976375 podStartE2EDuration="3.691976375s" podCreationTimestamp="2026-03-12 13:29:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:26.684429627 +0000 UTC m=+1189.374501588" watchObservedRunningTime="2026-03-12 13:29:26.691976375 +0000 UTC m=+1189.382048346"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.895849 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-594f99766c-xf6hh"]
Mar 12 13:29:26 crc kubenswrapper[4921]: E0312 13:29:26.896766 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9315da-7a44-4703-bc68-935d517a4412" containerName="barbican-db-sync"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.896931 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9315da-7a44-4703-bc68-935d517a4412" containerName="barbican-db-sync"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.897238 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9315da-7a44-4703-bc68-935d517a4412" containerName="barbican-db-sync"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.898298 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.914556 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qmvk5"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.923751 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.928661 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.951566 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d7515-b40d-418c-b32e-b6a857c040a7-logs\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.951628 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-combined-ca-bundle\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.951692 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxmt\" (UniqueName: \"kubernetes.io/projected/6c4d7515-b40d-418c-b32e-b6a857c040a7-kube-api-access-tzxmt\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.951731 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-config-data-custom\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.951756 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-config-data\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:26 crc kubenswrapper[4921]: I0312 13:29:26.971022 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-594f99766c-xf6hh"]
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.025879 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-76b64f84d4-tpqnj"]
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.027714 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.033661 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.052821 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-combined-ca-bundle\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.052878 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d7515-b40d-418c-b32e-b6a857c040a7-logs\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.052910 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-combined-ca-bundle\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.052940 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-config-data\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.052974 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxmt\" (UniqueName: \"kubernetes.io/projected/6c4d7515-b40d-418c-b32e-b6a857c040a7-kube-api-access-tzxmt\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.052994 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsq68\" (UniqueName: \"kubernetes.io/projected/47867e82-3783-4f22-bc4f-9128016cf98e-kube-api-access-bsq68\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.053010 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47867e82-3783-4f22-bc4f-9128016cf98e-logs\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.053046 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-config-data-custom\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.053066 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-config-data\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.053085 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-config-data-custom\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.054850 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c4d7515-b40d-418c-b32e-b6a857c040a7-logs\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.056106 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76b64f84d4-tpqnj"]
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.069585 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-config-data\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.074278 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-config-data-custom\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.078284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4d7515-b40d-418c-b32e-b6a857c040a7-combined-ca-bundle\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.084876 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-75x54"]
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.103570 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxmt\" (UniqueName: \"kubernetes.io/projected/6c4d7515-b40d-418c-b32e-b6a857c040a7-kube-api-access-tzxmt\") pod \"barbican-worker-594f99766c-xf6hh\" (UID: \"6c4d7515-b40d-418c-b32e-b6a857c040a7\") " pod="openstack/barbican-worker-594f99766c-xf6hh"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.120961 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-msxxx"]
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.133242 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.152873 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-msxxx"]
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154113 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krrmn\" (UniqueName: \"kubernetes.io/projected/c073128e-fc26-48f6-98d1-cdbb747363c6-kube-api-access-krrmn\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154150 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-config\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154171 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154186 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx"
Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154211 4921 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-combined-ca-bundle\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154254 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-config-data\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154284 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsq68\" (UniqueName: \"kubernetes.io/projected/47867e82-3783-4f22-bc4f-9128016cf98e-kube-api-access-bsq68\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154299 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47867e82-3783-4f22-bc4f-9128016cf98e-logs\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.154332 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-config-data-custom\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: 
I0312 13:29:27.154373 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.157421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/47867e82-3783-4f22-bc4f-9128016cf98e-logs\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.158064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-combined-ca-bundle\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.175514 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-config-data\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.184380 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsq68\" (UniqueName: \"kubernetes.io/projected/47867e82-3783-4f22-bc4f-9128016cf98e-kube-api-access-bsq68\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 
13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.184443 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5856fbd666-n2nmr"] Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.185946 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.186386 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/47867e82-3783-4f22-bc4f-9128016cf98e-config-data-custom\") pod \"barbican-keystone-listener-76b64f84d4-tpqnj\" (UID: \"47867e82-3783-4f22-bc4f-9128016cf98e\") " pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.189682 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.230796 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5856fbd666-n2nmr"] Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256059 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krrmn\" (UniqueName: \"kubernetes.io/projected/c073128e-fc26-48f6-98d1-cdbb747363c6-kube-api-access-krrmn\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256168 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-config\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256195 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256217 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256257 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-logs\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256306 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7465p\" (UniqueName: \"kubernetes.io/projected/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-kube-api-access-7465p\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256401 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data-custom\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256424 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-combined-ca-bundle\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.256464 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.258254 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.259708 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-config\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.260104 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.260529 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.272139 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-594f99766c-xf6hh" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.296721 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krrmn\" (UniqueName: \"kubernetes.io/projected/c073128e-fc26-48f6-98d1-cdbb747363c6-kube-api-access-krrmn\") pod \"dnsmasq-dns-6d97fcdd8f-msxxx\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.350372 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.365080 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data-custom\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.365134 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-combined-ca-bundle\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.365206 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.365267 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-logs\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.365320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7465p\" (UniqueName: \"kubernetes.io/projected/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-kube-api-access-7465p\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " 
pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.366235 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-logs\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.379405 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data-custom\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.379920 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-combined-ca-bundle\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.409182 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.430297 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7465p\" (UniqueName: \"kubernetes.io/projected/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-kube-api-access-7465p\") pod \"barbican-api-5856fbd666-n2nmr\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: 
I0312 13:29:27.515426 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.526092 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.535362 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.571742 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.571937 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data-custom\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.571980 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3de5063d-1ba7-4dd8-af7b-8d7286177244-etc-machine-id\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.572045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnmtc\" (UniqueName: \"kubernetes.io/projected/3de5063d-1ba7-4dd8-af7b-8d7286177244-kube-api-access-dnmtc\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.572063 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-scripts\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.572113 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de5063d-1ba7-4dd8-af7b-8d7286177244-logs\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.572132 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-combined-ca-bundle\") pod \"3de5063d-1ba7-4dd8-af7b-8d7286177244\" (UID: \"3de5063d-1ba7-4dd8-af7b-8d7286177244\") " Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.573220 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3de5063d-1ba7-4dd8-af7b-8d7286177244-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.573336 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3de5063d-1ba7-4dd8-af7b-8d7286177244-logs" (OuterVolumeSpecName: "logs") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.578395 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.586121 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-scripts" (OuterVolumeSpecName: "scripts") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.590121 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de5063d-1ba7-4dd8-af7b-8d7286177244-kube-api-access-dnmtc" (OuterVolumeSpecName: "kube-api-access-dnmtc") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "kube-api-access-dnmtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.636942 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.656843 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data" (OuterVolumeSpecName: "config-data") pod "3de5063d-1ba7-4dd8-af7b-8d7286177244" (UID: "3de5063d-1ba7-4dd8-af7b-8d7286177244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.674982 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.675017 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3de5063d-1ba7-4dd8-af7b-8d7286177244-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.675029 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnmtc\" (UniqueName: \"kubernetes.io/projected/3de5063d-1ba7-4dd8-af7b-8d7286177244-kube-api-access-dnmtc\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.675039 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.675047 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3de5063d-1ba7-4dd8-af7b-8d7286177244-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.675056 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.675063 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3de5063d-1ba7-4dd8-af7b-8d7286177244-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.697456 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf411ab8-cf05-4dbb-99d6-05c15227d433","Type":"ContainerStarted","Data":"2889001decb261fab0b054e7c562eb32f623dd26b5eb31526e36aa2e3637b7bb"} Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.702556 4921 generic.go:334] "Generic (PLEG): container finished" podID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerID="af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3" exitCode=0 Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.702581 4921 generic.go:334] "Generic (PLEG): container finished" podID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerID="a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66" exitCode=143 Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.703023 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.705899 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3de5063d-1ba7-4dd8-af7b-8d7286177244","Type":"ContainerDied","Data":"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3"} Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.705928 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3de5063d-1ba7-4dd8-af7b-8d7286177244","Type":"ContainerDied","Data":"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66"} Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.705940 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3de5063d-1ba7-4dd8-af7b-8d7286177244","Type":"ContainerDied","Data":"72d9d9bb02b16e2148fdb59cac9938b2f1ef3b9ab6a8adfc68ece4f0ad2339a5"} Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.705954 4921 scope.go:117] "RemoveContainer" containerID="af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.719076 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.782033678 podStartE2EDuration="4.719059411s" podCreationTimestamp="2026-03-12 13:29:23 +0000 UTC" firstStartedPulling="2026-03-12 13:29:24.589092851 +0000 UTC m=+1187.279164822" lastFinishedPulling="2026-03-12 13:29:25.526118574 +0000 UTC m=+1188.216190555" observedRunningTime="2026-03-12 13:29:27.715010238 +0000 UTC m=+1190.405082209" watchObservedRunningTime="2026-03-12 13:29:27.719059411 +0000 UTC m=+1190.409131382" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.791890 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.794657 4921 scope.go:117] "RemoveContainer" 
containerID="a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.807019 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.824754 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:27 crc kubenswrapper[4921]: E0312 13:29:27.825114 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api-log" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.825128 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api-log" Mar 12 13:29:27 crc kubenswrapper[4921]: E0312 13:29:27.825152 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.825157 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.825337 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api-log" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.825351 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" containerName="cinder-api" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.826148 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.835614 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.835936 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.836051 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.837929 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.849189 4921 scope.go:117] "RemoveContainer" containerID="af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3" Mar 12 13:29:27 crc kubenswrapper[4921]: E0312 13:29:27.849925 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3\": container with ID starting with af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3 not found: ID does not exist" containerID="af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.849986 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3"} err="failed to get container status \"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3\": rpc error: code = NotFound desc = could not find container \"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3\": container with ID starting with af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3 not found: ID does not exist" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 
13:29:27.850012 4921 scope.go:117] "RemoveContainer" containerID="a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66" Mar 12 13:29:27 crc kubenswrapper[4921]: E0312 13:29:27.850641 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66\": container with ID starting with a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66 not found: ID does not exist" containerID="a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.850672 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66"} err="failed to get container status \"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66\": rpc error: code = NotFound desc = could not find container \"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66\": container with ID starting with a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66 not found: ID does not exist" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.850696 4921 scope.go:117] "RemoveContainer" containerID="af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.851162 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3"} err="failed to get container status \"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3\": rpc error: code = NotFound desc = could not find container \"af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3\": container with ID starting with af016d1fd7f1d3bca0400c1d1f56bcd28f4960b151d7b651013a4f11ecba7bb3 not found: ID does not exist" Mar 12 13:29:27 crc 
kubenswrapper[4921]: I0312 13:29:27.851186 4921 scope.go:117] "RemoveContainer" containerID="a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.851996 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66"} err="failed to get container status \"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66\": rpc error: code = NotFound desc = could not find container \"a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66\": container with ID starting with a2efc1383c3f2e2502e1d6a14d09f8b980e3952736d08837bc70c44185aaac66 not found: ID does not exist" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.882739 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.882790 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-scripts\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.882848 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.882973 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g927b\" (UniqueName: \"kubernetes.io/projected/a5b74f92-1f9b-4321-b549-47269e3eb04c-kube-api-access-g927b\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.883182 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5b74f92-1f9b-4321-b549-47269e3eb04c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.883382 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-config-data\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.883404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.883577 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.883630 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a5b74f92-1f9b-4321-b549-47269e3eb04c-logs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.964526 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-76b64f84d4-tpqnj"] Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991159 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991216 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5b74f92-1f9b-4321-b549-47269e3eb04c-logs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991260 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-config-data-custom\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991285 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-scripts\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991304 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991337 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g927b\" (UniqueName: \"kubernetes.io/projected/a5b74f92-1f9b-4321-b549-47269e3eb04c-kube-api-access-g927b\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991353 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5b74f92-1f9b-4321-b549-47269e3eb04c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991387 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-config-data\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.991416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.993142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a5b74f92-1f9b-4321-b549-47269e3eb04c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc 
kubenswrapper[4921]: I0312 13:29:27.993657 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5b74f92-1f9b-4321-b549-47269e3eb04c-logs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:27 crc kubenswrapper[4921]: I0312 13:29:27.998356 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.000479 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.002535 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.002601 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-scripts\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.002718 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.009935 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g927b\" (UniqueName: \"kubernetes.io/projected/a5b74f92-1f9b-4321-b549-47269e3eb04c-kube-api-access-g927b\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.012279 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b74f92-1f9b-4321-b549-47269e3eb04c-config-data\") pod \"cinder-api-0\" (UID: \"a5b74f92-1f9b-4321-b549-47269e3eb04c\") " pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.019215 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3de5063d-1ba7-4dd8-af7b-8d7286177244" path="/var/lib/kubelet/pods/3de5063d-1ba7-4dd8-af7b-8d7286177244/volumes" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.020660 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-594f99766c-xf6hh"] Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.173339 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.201668 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-msxxx"] Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.229492 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5856fbd666-n2nmr"] Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.658730 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 12 13:29:28 crc kubenswrapper[4921]: W0312 13:29:28.670391 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b74f92_1f9b_4321_b549_47269e3eb04c.slice/crio-bf3f7223f9ac0a959f7c8f9d41b61d5a58e6dcdf2e36ec45931650114da433be WatchSource:0}: Error finding container bf3f7223f9ac0a959f7c8f9d41b61d5a58e6dcdf2e36ec45931650114da433be: Status 404 returned error can't find the container with id bf3f7223f9ac0a959f7c8f9d41b61d5a58e6dcdf2e36ec45931650114da433be Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.720194 4921 generic.go:334] "Generic (PLEG): container finished" podID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerID="c00a16430304386cbce881d70ab096702dcb2de626bce4ecbbbe89486aabafac" exitCode=0 Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.720303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" event={"ID":"c073128e-fc26-48f6-98d1-cdbb747363c6","Type":"ContainerDied","Data":"c00a16430304386cbce881d70ab096702dcb2de626bce4ecbbbe89486aabafac"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.720347 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" event={"ID":"c073128e-fc26-48f6-98d1-cdbb747363c6","Type":"ContainerStarted","Data":"ed7a6a0820ca72a0749c72a878e56251a1e44693b3cbfa5cdcb49feba2d62f8c"} Mar 12 13:29:28 crc kubenswrapper[4921]: 
I0312 13:29:28.723549 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" event={"ID":"47867e82-3783-4f22-bc4f-9128016cf98e","Type":"ContainerStarted","Data":"5c6533cb0bbb36cb09a9d6d80e4510df4fbd926fafd54fb44587c9087255e2af"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.742940 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerStarted","Data":"8a4e54ba77111d37ff7d676f549552ee755ce8d297a481ce87b665e127397111"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.759723 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-594f99766c-xf6hh" event={"ID":"6c4d7515-b40d-418c-b32e-b6a857c040a7","Type":"ContainerStarted","Data":"97e512283905e698cff268e7a7b3b8d844bc3a723e345527976125c9851527b7"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.773611 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5b74f92-1f9b-4321-b549-47269e3eb04c","Type":"ContainerStarted","Data":"bf3f7223f9ac0a959f7c8f9d41b61d5a58e6dcdf2e36ec45931650114da433be"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.778357 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856fbd666-n2nmr" event={"ID":"cb38ca02-497e-48a0-8b6b-c6b1dbea5568","Type":"ContainerStarted","Data":"588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.778463 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856fbd666-n2nmr" event={"ID":"cb38ca02-497e-48a0-8b6b-c6b1dbea5568","Type":"ContainerStarted","Data":"c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.778481 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-5856fbd666-n2nmr" event={"ID":"cb38ca02-497e-48a0-8b6b-c6b1dbea5568","Type":"ContainerStarted","Data":"6d88f854df1d88d540445cdf47bcab0327ea8000c83bc5a9d3f192150835dc20"} Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.778969 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.779074 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.794154 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f64d5748f-75x54" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerName="dnsmasq-dns" containerID="cri-o://94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37" gracePeriod=10 Mar 12 13:29:28 crc kubenswrapper[4921]: I0312 13:29:28.836802 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5856fbd666-n2nmr" podStartSLOduration=1.8367736159999999 podStartE2EDuration="1.836773616s" podCreationTimestamp="2026-03-12 13:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:28.814042899 +0000 UTC m=+1191.504114870" watchObservedRunningTime="2026-03-12 13:29:28.836773616 +0000 UTC m=+1191.526845587" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.095615 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.398730 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.533570 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-sb\") pod \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.533931 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtm6q\" (UniqueName: \"kubernetes.io/projected/38361fc2-c7d9-44e0-a0bc-5874167f9f91-kube-api-access-mtm6q\") pod \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.533990 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-config\") pod \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.534045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-dns-svc\") pod \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.534098 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-nb\") pod \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\" (UID: \"38361fc2-c7d9-44e0-a0bc-5874167f9f91\") " Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.542982 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/38361fc2-c7d9-44e0-a0bc-5874167f9f91-kube-api-access-mtm6q" (OuterVolumeSpecName: "kube-api-access-mtm6q") pod "38361fc2-c7d9-44e0-a0bc-5874167f9f91" (UID: "38361fc2-c7d9-44e0-a0bc-5874167f9f91"). InnerVolumeSpecName "kube-api-access-mtm6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.576503 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38361fc2-c7d9-44e0-a0bc-5874167f9f91" (UID: "38361fc2-c7d9-44e0-a0bc-5874167f9f91"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.581893 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-config" (OuterVolumeSpecName: "config") pod "38361fc2-c7d9-44e0-a0bc-5874167f9f91" (UID: "38361fc2-c7d9-44e0-a0bc-5874167f9f91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.582718 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38361fc2-c7d9-44e0-a0bc-5874167f9f91" (UID: "38361fc2-c7d9-44e0-a0bc-5874167f9f91"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.589463 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38361fc2-c7d9-44e0-a0bc-5874167f9f91" (UID: "38361fc2-c7d9-44e0-a0bc-5874167f9f91"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.635997 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.636042 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtm6q\" (UniqueName: \"kubernetes.io/projected/38361fc2-c7d9-44e0-a0bc-5874167f9f91-kube-api-access-mtm6q\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.636149 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.636163 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.636173 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38361fc2-c7d9-44e0-a0bc-5874167f9f91-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.806804 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5b74f92-1f9b-4321-b549-47269e3eb04c","Type":"ContainerStarted","Data":"0a6b9681bc426480adcf25f763e878f1b1a08d49ba858d674a18b62b4effe1ea"} Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.809210 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" event={"ID":"c073128e-fc26-48f6-98d1-cdbb747363c6","Type":"ContainerStarted","Data":"17d350bba769855dcb48166e7328c4f44fbc010288bd9d064cc318b9c562d02e"} Mar 
12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.809359 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.848882 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" podStartSLOduration=2.848860798 podStartE2EDuration="2.848860798s" podCreationTimestamp="2026-03-12 13:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:29.832032909 +0000 UTC m=+1192.522104900" watchObservedRunningTime="2026-03-12 13:29:29.848860798 +0000 UTC m=+1192.538932769" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.854033 4921 generic.go:334] "Generic (PLEG): container finished" podID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerID="94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37" exitCode=0 Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.857863 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-75x54" event={"ID":"38361fc2-c7d9-44e0-a0bc-5874167f9f91","Type":"ContainerDied","Data":"94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37"} Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.857935 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-75x54" event={"ID":"38361fc2-c7d9-44e0-a0bc-5874167f9f91","Type":"ContainerDied","Data":"08a9d4738b1a3e23a5c15e28e5127ac80244db70bbd91fe86959c0ee52e942c4"} Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.857958 4921 scope.go:117] "RemoveContainer" containerID="94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.858177 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-75x54" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.965557 4921 scope.go:117] "RemoveContainer" containerID="b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82" Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.996218 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-75x54"] Mar 12 13:29:29 crc kubenswrapper[4921]: I0312 13:29:29.996249 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-75x54"] Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.022810 4921 scope.go:117] "RemoveContainer" containerID="94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37" Mar 12 13:29:30 crc kubenswrapper[4921]: E0312 13:29:30.023581 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37\": container with ID starting with 94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37 not found: ID does not exist" containerID="94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.023610 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37"} err="failed to get container status \"94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37\": rpc error: code = NotFound desc = could not find container \"94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37\": container with ID starting with 94c8c120f9dff68cf91fcb64856f0d0fcec42d9eab6c3ae5a66fbdebea7c0a37 not found: ID does not exist" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.023632 4921 scope.go:117] "RemoveContainer" containerID="b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82" Mar 12 
13:29:30 crc kubenswrapper[4921]: E0312 13:29:30.024080 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82\": container with ID starting with b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82 not found: ID does not exist" containerID="b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.024102 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82"} err="failed to get container status \"b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82\": rpc error: code = NotFound desc = could not find container \"b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82\": container with ID starting with b0f8a6a272551669a44140fdc016aee05696510531604cf39f6d677993828a82 not found: ID does not exist" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.774210 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bcbd96998-bx4p5"] Mar 12 13:29:30 crc kubenswrapper[4921]: E0312 13:29:30.781445 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerName="dnsmasq-dns" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.781481 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerName="dnsmasq-dns" Mar 12 13:29:30 crc kubenswrapper[4921]: E0312 13:29:30.781505 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerName="init" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.781511 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerName="init" Mar 12 13:29:30 crc 
kubenswrapper[4921]: I0312 13:29:30.781967 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" containerName="dnsmasq-dns" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.783325 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.787389 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.787608 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.832193 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bcbd96998-bx4p5"] Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-combined-ca-bundle\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857344 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-internal-tls-certs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857362 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-public-tls-certs\") pod 
\"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-config-data\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857434 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-config-data-custom\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857451 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5klz\" (UniqueName: \"kubernetes.io/projected/59a6f440-5a89-42a7-baa1-77a875476665-kube-api-access-h5klz\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.857474 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59a6f440-5a89-42a7-baa1-77a875476665-logs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.902979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" 
event={"ID":"47867e82-3783-4f22-bc4f-9128016cf98e","Type":"ContainerStarted","Data":"b42197850481a90fbe352f5b0c0c0203976ed3f102103d2bb1ca69e0346a75db"} Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.903018 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" event={"ID":"47867e82-3783-4f22-bc4f-9128016cf98e","Type":"ContainerStarted","Data":"bdc6ab98b439b85222f3ba06a593b79816ad43e3111d3bb044ec01a3914dac0b"} Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.905388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerStarted","Data":"9f27556e5c3f50472dd219423dccabe1ab44e19b36b80ab0ee364851ae33325a"} Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.907417 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a5b74f92-1f9b-4321-b549-47269e3eb04c","Type":"ContainerStarted","Data":"b56d5e96de534d7943073b2f995dfd7ae906bf62129678f2e34765b81e00b742"} Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.907444 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.925782 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-76b64f84d4-tpqnj" podStartSLOduration=3.134669721 podStartE2EDuration="4.92575385s" podCreationTimestamp="2026-03-12 13:29:26 +0000 UTC" firstStartedPulling="2026-03-12 13:29:27.97243018 +0000 UTC m=+1190.662502151" lastFinishedPulling="2026-03-12 13:29:29.763514309 +0000 UTC m=+1192.453586280" observedRunningTime="2026-03-12 13:29:30.923375287 +0000 UTC m=+1193.613447268" watchObservedRunningTime="2026-03-12 13:29:30.92575385 +0000 UTC m=+1193.615825821" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.952839 4921 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.952800257 podStartE2EDuration="3.952800257s" podCreationTimestamp="2026-03-12 13:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:30.951001113 +0000 UTC m=+1193.641073084" watchObservedRunningTime="2026-03-12 13:29:30.952800257 +0000 UTC m=+1193.642872228" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.959046 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-config-data\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.959710 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-config-data-custom\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.959967 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5klz\" (UniqueName: \"kubernetes.io/projected/59a6f440-5a89-42a7-baa1-77a875476665-kube-api-access-h5klz\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.959993 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59a6f440-5a89-42a7-baa1-77a875476665-logs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 
crc kubenswrapper[4921]: I0312 13:29:30.960489 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-combined-ca-bundle\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.960552 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-internal-tls-certs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.960569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-public-tls-certs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.961104 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59a6f440-5a89-42a7-baa1-77a875476665-logs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.966835 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-config-data\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.968965 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-config-data-custom\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.970066 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-public-tls-certs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.971122 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-combined-ca-bundle\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.977439 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a6f440-5a89-42a7-baa1-77a875476665-internal-tls-certs\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:30 crc kubenswrapper[4921]: I0312 13:29:30.977864 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5klz\" (UniqueName: \"kubernetes.io/projected/59a6f440-5a89-42a7-baa1-77a875476665-kube-api-access-h5klz\") pod \"barbican-api-bcbd96998-bx4p5\" (UID: \"59a6f440-5a89-42a7-baa1-77a875476665\") " pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.114374 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.581624 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bcbd96998-bx4p5"] Mar 12 13:29:31 crc kubenswrapper[4921]: W0312 13:29:31.590022 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59a6f440_5a89_42a7_baa1_77a875476665.slice/crio-d28344eff940da2d46737738875e3d1c38a1d340d7dadc84fb665fee2a184167 WatchSource:0}: Error finding container d28344eff940da2d46737738875e3d1c38a1d340d7dadc84fb665fee2a184167: Status 404 returned error can't find the container with id d28344eff940da2d46737738875e3d1c38a1d340d7dadc84fb665fee2a184167 Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.929483 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerStarted","Data":"08f5b742b9a925ddd2db57514e75c2687399998cd21ba35f6b3848036beb9705"} Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.933898 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-594f99766c-xf6hh" event={"ID":"6c4d7515-b40d-418c-b32e-b6a857c040a7","Type":"ContainerStarted","Data":"b5c781f44f108183679b77f73e6591eebc098d1e86b33cd1bc11c0144f720978"} Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.933938 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-594f99766c-xf6hh" event={"ID":"6c4d7515-b40d-418c-b32e-b6a857c040a7","Type":"ContainerStarted","Data":"1d60c1e8b972b1f67302e8705987d5d90547b8f48b8ff45bd1beaaeb0b7a4ec2"} Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.944253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bcbd96998-bx4p5" 
event={"ID":"59a6f440-5a89-42a7-baa1-77a875476665","Type":"ContainerStarted","Data":"933de2b8912c1c4eb6ce5c9d11996076a23d14e7c7f815996281cad7e228744a"} Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.944312 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bcbd96998-bx4p5" event={"ID":"59a6f440-5a89-42a7-baa1-77a875476665","Type":"ContainerStarted","Data":"d28344eff940da2d46737738875e3d1c38a1d340d7dadc84fb665fee2a184167"} Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.953613 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-594f99766c-xf6hh" podStartSLOduration=3.371720205 podStartE2EDuration="5.953596397s" podCreationTimestamp="2026-03-12 13:29:26 +0000 UTC" firstStartedPulling="2026-03-12 13:29:28.012440319 +0000 UTC m=+1190.702512300" lastFinishedPulling="2026-03-12 13:29:30.594316511 +0000 UTC m=+1193.284388492" observedRunningTime="2026-03-12 13:29:31.949093292 +0000 UTC m=+1194.639165263" watchObservedRunningTime="2026-03-12 13:29:31.953596397 +0000 UTC m=+1194.643668358" Mar 12 13:29:31 crc kubenswrapper[4921]: I0312 13:29:31.992318 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38361fc2-c7d9-44e0-a0bc-5874167f9f91" path="/var/lib/kubelet/pods/38361fc2-c7d9-44e0-a0bc-5874167f9f91/volumes" Mar 12 13:29:32 crc kubenswrapper[4921]: I0312 13:29:32.960429 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bcbd96998-bx4p5" event={"ID":"59a6f440-5a89-42a7-baa1-77a875476665","Type":"ContainerStarted","Data":"b3d7bcb91dc23e94de029dd232ec538bf1ddc871c558c8a3fbb737f9f1a48ee4"} Mar 12 13:29:32 crc kubenswrapper[4921]: I0312 13:29:32.960658 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:32 crc kubenswrapper[4921]: I0312 13:29:32.996024 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/barbican-api-bcbd96998-bx4p5" podStartSLOduration=2.995999497 podStartE2EDuration="2.995999497s" podCreationTimestamp="2026-03-12 13:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:32.991205192 +0000 UTC m=+1195.681277183" watchObservedRunningTime="2026-03-12 13:29:32.995999497 +0000 UTC m=+1195.686071468" Mar 12 13:29:34 crc kubenswrapper[4921]: I0312 13:29:34.009780 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerStarted","Data":"7e4a80126550f6bb637e807eeaddfb628a7fa3237155c65c26691d13eaab778f"} Mar 12 13:29:34 crc kubenswrapper[4921]: I0312 13:29:34.010433 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:34 crc kubenswrapper[4921]: I0312 13:29:34.010450 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 13:29:34 crc kubenswrapper[4921]: I0312 13:29:34.047365 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.957167962 podStartE2EDuration="9.047347596s" podCreationTimestamp="2026-03-12 13:29:25 +0000 UTC" firstStartedPulling="2026-03-12 13:29:26.589326453 +0000 UTC m=+1189.279398424" lastFinishedPulling="2026-03-12 13:29:33.679506087 +0000 UTC m=+1196.369578058" observedRunningTime="2026-03-12 13:29:34.034090955 +0000 UTC m=+1196.724162936" watchObservedRunningTime="2026-03-12 13:29:34.047347596 +0000 UTC m=+1196.737419567" Mar 12 13:29:34 crc kubenswrapper[4921]: I0312 13:29:34.326141 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 12 13:29:34 crc kubenswrapper[4921]: I0312 13:29:34.370396 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:35 crc kubenswrapper[4921]: I0312 13:29:35.016494 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="cinder-scheduler" containerID="cri-o://ddc6034ca1768e62b2a90d78557e9de3eee96cf182d83646944f2a43d6164722" gracePeriod=30 Mar 12 13:29:35 crc kubenswrapper[4921]: I0312 13:29:35.017634 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="probe" containerID="cri-o://2889001decb261fab0b054e7c562eb32f623dd26b5eb31526e36aa2e3637b7bb" gracePeriod=30 Mar 12 13:29:35 crc kubenswrapper[4921]: I0312 13:29:35.790324 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.027210 4921 generic.go:334] "Generic (PLEG): container finished" podID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerID="2889001decb261fab0b054e7c562eb32f623dd26b5eb31526e36aa2e3637b7bb" exitCode=0 Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.027260 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf411ab8-cf05-4dbb-99d6-05c15227d433","Type":"ContainerDied","Data":"2889001decb261fab0b054e7c562eb32f623dd26b5eb31526e36aa2e3637b7bb"} Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.052948 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-676f8c65df-nxrf5"] Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.053217 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-676f8c65df-nxrf5" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-api" containerID="cri-o://694464c1f45d4e2e6d1383d11455ef2cd84aec943d4c7963d7e68de0022a32c6" gracePeriod=30 Mar 12 
13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.054500 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-676f8c65df-nxrf5" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-httpd" containerID="cri-o://882f00afa5a8574380a3cbcdeb51f134e2802146be94414cd8bf3305d0077bf2" gracePeriod=30 Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.063074 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-676f8c65df-nxrf5" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.146:9696/\": EOF" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.080588 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77dd7dfdbc-bp67m"] Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.081879 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.093195 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77dd7dfdbc-bp67m"] Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.171124 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-config\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.171249 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-combined-ca-bundle\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 
13:29:36.171279 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-ovndb-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.171325 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-public-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.171353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-internal-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.171427 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9lp\" (UniqueName: \"kubernetes.io/projected/426d27b4-1f08-4c20-84c9-67b47fbc4753-kube-api-access-7d9lp\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.171457 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-httpd-config\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc 
kubenswrapper[4921]: I0312 13:29:36.276725 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9lp\" (UniqueName: \"kubernetes.io/projected/426d27b4-1f08-4c20-84c9-67b47fbc4753-kube-api-access-7d9lp\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.276795 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-httpd-config\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.276850 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-config\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.276904 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-combined-ca-bundle\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.276929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-ovndb-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.276944 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-public-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.276973 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-internal-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.282603 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-internal-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.282645 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-config\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.284482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-public-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.285461 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-httpd-config\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.290792 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-ovndb-tls-certs\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.296257 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-combined-ca-bundle\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.301362 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9lp\" (UniqueName: \"kubernetes.io/projected/426d27b4-1f08-4c20-84c9-67b47fbc4753-kube-api-access-7d9lp\") pod \"neutron-77dd7dfdbc-bp67m\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:36 crc kubenswrapper[4921]: I0312 13:29:36.411298 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:37 crc kubenswrapper[4921]: I0312 13:29:37.035677 4921 generic.go:334] "Generic (PLEG): container finished" podID="57d73461-cb3e-4790-9576-1cb19e03815c" containerID="882f00afa5a8574380a3cbcdeb51f134e2802146be94414cd8bf3305d0077bf2" exitCode=0 Mar 12 13:29:37 crc kubenswrapper[4921]: I0312 13:29:37.035966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-676f8c65df-nxrf5" event={"ID":"57d73461-cb3e-4790-9576-1cb19e03815c","Type":"ContainerDied","Data":"882f00afa5a8574380a3cbcdeb51f134e2802146be94414cd8bf3305d0077bf2"} Mar 12 13:29:37 crc kubenswrapper[4921]: I0312 13:29:37.134926 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77dd7dfdbc-bp67m"] Mar 12 13:29:37 crc kubenswrapper[4921]: I0312 13:29:37.518465 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:29:37 crc kubenswrapper[4921]: I0312 13:29:37.604528 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hjhq8"] Mar 12 13:29:37 crc kubenswrapper[4921]: I0312 13:29:37.604833 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerName="dnsmasq-dns" containerID="cri-o://56aeeb2d6810f39ed6253f94ef23636bfe49078974a5a2c62e0a0b5dbb5b4da7" gracePeriod=10 Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.101704 4921 generic.go:334] "Generic (PLEG): container finished" podID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerID="56aeeb2d6810f39ed6253f94ef23636bfe49078974a5a2c62e0a0b5dbb5b4da7" exitCode=0 Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.102171 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" 
event={"ID":"9bd9f449-a423-4e35-9177-37728b5fdcf9","Type":"ContainerDied","Data":"56aeeb2d6810f39ed6253f94ef23636bfe49078974a5a2c62e0a0b5dbb5b4da7"} Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.108216 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-bp67m" event={"ID":"426d27b4-1f08-4c20-84c9-67b47fbc4753","Type":"ContainerStarted","Data":"a6f5bd804e97a767d1a05fb61556f05b13af54eb76fa4abc65dc0c39906023dc"} Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.108264 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-bp67m" event={"ID":"426d27b4-1f08-4c20-84c9-67b47fbc4753","Type":"ContainerStarted","Data":"4d5f602f5fbbc38d69a28156dde47e51200d00e7f0a8d0d4bcb619b9900a14c7"} Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.180287 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-676f8c65df-nxrf5" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.146:9696/\": dial tcp 10.217.0.146:9696: connect: connection refused" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.364177 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.418523 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.554178 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-dns-svc\") pod \"9bd9f449-a423-4e35-9177-37728b5fdcf9\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.554294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-nb\") pod \"9bd9f449-a423-4e35-9177-37728b5fdcf9\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.554372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq6zg\" (UniqueName: \"kubernetes.io/projected/9bd9f449-a423-4e35-9177-37728b5fdcf9-kube-api-access-kq6zg\") pod \"9bd9f449-a423-4e35-9177-37728b5fdcf9\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.554408 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-sb\") pod \"9bd9f449-a423-4e35-9177-37728b5fdcf9\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.554449 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-config\") pod \"9bd9f449-a423-4e35-9177-37728b5fdcf9\" (UID: \"9bd9f449-a423-4e35-9177-37728b5fdcf9\") " Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.566308 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9bd9f449-a423-4e35-9177-37728b5fdcf9-kube-api-access-kq6zg" (OuterVolumeSpecName: "kube-api-access-kq6zg") pod "9bd9f449-a423-4e35-9177-37728b5fdcf9" (UID: "9bd9f449-a423-4e35-9177-37728b5fdcf9"). InnerVolumeSpecName "kube-api-access-kq6zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.608560 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-config" (OuterVolumeSpecName: "config") pod "9bd9f449-a423-4e35-9177-37728b5fdcf9" (UID: "9bd9f449-a423-4e35-9177-37728b5fdcf9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.615216 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9bd9f449-a423-4e35-9177-37728b5fdcf9" (UID: "9bd9f449-a423-4e35-9177-37728b5fdcf9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.627711 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9bd9f449-a423-4e35-9177-37728b5fdcf9" (UID: "9bd9f449-a423-4e35-9177-37728b5fdcf9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.632209 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9bd9f449-a423-4e35-9177-37728b5fdcf9" (UID: "9bd9f449-a423-4e35-9177-37728b5fdcf9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.656628 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.656667 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.656680 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq6zg\" (UniqueName: \"kubernetes.io/projected/9bd9f449-a423-4e35-9177-37728b5fdcf9-kube-api-access-kq6zg\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.656690 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:38 crc kubenswrapper[4921]: I0312 13:29:38.656699 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bd9f449-a423-4e35-9177-37728b5fdcf9-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.126081 4921 generic.go:334] "Generic (PLEG): container finished" podID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerID="ddc6034ca1768e62b2a90d78557e9de3eee96cf182d83646944f2a43d6164722" exitCode=0 Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.126185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf411ab8-cf05-4dbb-99d6-05c15227d433","Type":"ContainerDied","Data":"ddc6034ca1768e62b2a90d78557e9de3eee96cf182d83646944f2a43d6164722"} Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 
13:29:39.131085 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-bp67m" event={"ID":"426d27b4-1f08-4c20-84c9-67b47fbc4753","Type":"ContainerStarted","Data":"65c14d7e8788a0f9a68db25707046d3b0017bcec70c84e179a1930b781c69dc4"} Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.131186 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.132455 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" event={"ID":"9bd9f449-a423-4e35-9177-37728b5fdcf9","Type":"ContainerDied","Data":"260f9baa05851cf421e6eeb8c04458d9e3d88cc1a450833d3c8a552bec8ee049"} Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.132500 4921 scope.go:117] "RemoveContainer" containerID="56aeeb2d6810f39ed6253f94ef23636bfe49078974a5a2c62e0a0b5dbb5b4da7" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.132516 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-hjhq8" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.159896 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77dd7dfdbc-bp67m" podStartSLOduration=3.159879622 podStartE2EDuration="3.159879622s" podCreationTimestamp="2026-03-12 13:29:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:39.154743247 +0000 UTC m=+1201.844815218" watchObservedRunningTime="2026-03-12 13:29:39.159879622 +0000 UTC m=+1201.849951593" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.217556 4921 scope.go:117] "RemoveContainer" containerID="f216143a8008bac79a5dc0d4fd1ded16e190386846073bf123e5c6be365d0bbc" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.233889 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hjhq8"] Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.247120 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-hjhq8"] Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.499672 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.578421 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data\") pod \"cf411ab8-cf05-4dbb-99d6-05c15227d433\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.578605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-scripts\") pod \"cf411ab8-cf05-4dbb-99d6-05c15227d433\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.578634 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb54c\" (UniqueName: \"kubernetes.io/projected/cf411ab8-cf05-4dbb-99d6-05c15227d433-kube-api-access-pb54c\") pod \"cf411ab8-cf05-4dbb-99d6-05c15227d433\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.578679 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-combined-ca-bundle\") pod \"cf411ab8-cf05-4dbb-99d6-05c15227d433\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.578714 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf411ab8-cf05-4dbb-99d6-05c15227d433-etc-machine-id\") pod \"cf411ab8-cf05-4dbb-99d6-05c15227d433\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.578736 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data-custom\") pod \"cf411ab8-cf05-4dbb-99d6-05c15227d433\" (UID: \"cf411ab8-cf05-4dbb-99d6-05c15227d433\") " Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.579072 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf411ab8-cf05-4dbb-99d6-05c15227d433-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cf411ab8-cf05-4dbb-99d6-05c15227d433" (UID: "cf411ab8-cf05-4dbb-99d6-05c15227d433"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.579788 4921 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cf411ab8-cf05-4dbb-99d6-05c15227d433-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.585763 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cf411ab8-cf05-4dbb-99d6-05c15227d433" (UID: "cf411ab8-cf05-4dbb-99d6-05c15227d433"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.589927 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-scripts" (OuterVolumeSpecName: "scripts") pod "cf411ab8-cf05-4dbb-99d6-05c15227d433" (UID: "cf411ab8-cf05-4dbb-99d6-05c15227d433"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.591962 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf411ab8-cf05-4dbb-99d6-05c15227d433-kube-api-access-pb54c" (OuterVolumeSpecName: "kube-api-access-pb54c") pod "cf411ab8-cf05-4dbb-99d6-05c15227d433" (UID: "cf411ab8-cf05-4dbb-99d6-05c15227d433"). InnerVolumeSpecName "kube-api-access-pb54c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.630936 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf411ab8-cf05-4dbb-99d6-05c15227d433" (UID: "cf411ab8-cf05-4dbb-99d6-05c15227d433"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.681391 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.681429 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb54c\" (UniqueName: \"kubernetes.io/projected/cf411ab8-cf05-4dbb-99d6-05c15227d433-kube-api-access-pb54c\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.681439 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.681447 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.692186 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data" (OuterVolumeSpecName: "config-data") pod "cf411ab8-cf05-4dbb-99d6-05c15227d433" (UID: "cf411ab8-cf05-4dbb-99d6-05c15227d433"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.782942 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf411ab8-cf05-4dbb-99d6-05c15227d433-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:39 crc kubenswrapper[4921]: I0312 13:29:39.994060 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" path="/var/lib/kubelet/pods/9bd9f449-a423-4e35-9177-37728b5fdcf9/volumes" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.142518 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.142734 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cf411ab8-cf05-4dbb-99d6-05c15227d433","Type":"ContainerDied","Data":"8588d43d47c7715159c77c6c924e167b2be0f672f6f217b34c45393ce09ea035"} Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.143214 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.144057 4921 scope.go:117] "RemoveContainer" containerID="2889001decb261fab0b054e7c562eb32f623dd26b5eb31526e36aa2e3637b7bb" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.211378 4921 scope.go:117] "RemoveContainer" containerID="ddc6034ca1768e62b2a90d78557e9de3eee96cf182d83646944f2a43d6164722" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.215489 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.222921 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.248953 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.262516 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:40 crc kubenswrapper[4921]: E0312 13:29:40.262862 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerName="init" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.262877 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerName="init" Mar 12 13:29:40 crc kubenswrapper[4921]: E0312 13:29:40.262892 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="probe" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.262898 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="probe" Mar 12 13:29:40 crc kubenswrapper[4921]: E0312 13:29:40.262916 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="cinder-scheduler" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.262922 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="cinder-scheduler" Mar 12 13:29:40 crc kubenswrapper[4921]: E0312 13:29:40.262937 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerName="dnsmasq-dns" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.262943 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerName="dnsmasq-dns" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.263101 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="cinder-scheduler" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.263119 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bd9f449-a423-4e35-9177-37728b5fdcf9" containerName="dnsmasq-dns" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.263135 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" containerName="probe" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.264376 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.266902 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.276213 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.393978 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.394041 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.394092 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf5m9\" (UniqueName: \"kubernetes.io/projected/7cda98bc-d6ac-4204-8477-8ecd7dafb976-kube-api-access-rf5m9\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.394128 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cda98bc-d6ac-4204-8477-8ecd7dafb976-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc 
kubenswrapper[4921]: I0312 13:29:40.394143 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.394164 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.480608 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7f7ffb8f48-l6m2k"] Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495320 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495407 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495469 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf5m9\" (UniqueName: \"kubernetes.io/projected/7cda98bc-d6ac-4204-8477-8ecd7dafb976-kube-api-access-rf5m9\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " 
pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cda98bc-d6ac-4204-8477-8ecd7dafb976-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495541 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495576 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.495620 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cda98bc-d6ac-4204-8477-8ecd7dafb976-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.501005 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.503720 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.506361 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.510500 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.527051 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7ffb8f48-l6m2k"] Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.530432 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cda98bc-d6ac-4204-8477-8ecd7dafb976-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.532411 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf5m9\" (UniqueName: \"kubernetes.io/projected/7cda98bc-d6ac-4204-8477-8ecd7dafb976-kube-api-access-rf5m9\") pod \"cinder-scheduler-0\" (UID: \"7cda98bc-d6ac-4204-8477-8ecd7dafb976\") " 
pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.587973 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597453 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0091a555-ed5b-415c-ba49-7d2c64fdf54d-logs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597499 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-scripts\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597517 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-combined-ca-bundle\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597540 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-config-data\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597593 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-internal-tls-certs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597624 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-public-tls-certs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.597692 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4g9w\" (UniqueName: \"kubernetes.io/projected/0091a555-ed5b-415c-ba49-7d2c64fdf54d-kube-api-access-q4g9w\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.641786 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.708713 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-public-tls-certs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.709117 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4g9w\" (UniqueName: \"kubernetes.io/projected/0091a555-ed5b-415c-ba49-7d2c64fdf54d-kube-api-access-q4g9w\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " 
pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.709189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0091a555-ed5b-415c-ba49-7d2c64fdf54d-logs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.709207 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-scripts\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.709223 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-combined-ca-bundle\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.709244 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-config-data\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.709306 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-internal-tls-certs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.712340 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0091a555-ed5b-415c-ba49-7d2c64fdf54d-logs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.715738 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-scripts\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.731307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-combined-ca-bundle\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.735545 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-public-tls-certs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.736264 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-internal-tls-certs\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.736840 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0091a555-ed5b-415c-ba49-7d2c64fdf54d-config-data\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.764982 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4g9w\" (UniqueName: \"kubernetes.io/projected/0091a555-ed5b-415c-ba49-7d2c64fdf54d-kube-api-access-q4g9w\") pod \"placement-7f7ffb8f48-l6m2k\" (UID: \"0091a555-ed5b-415c-ba49-7d2c64fdf54d\") " pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:40 crc kubenswrapper[4921]: I0312 13:29:40.907865 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:41 crc kubenswrapper[4921]: I0312 13:29:41.218143 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 12 13:29:41 crc kubenswrapper[4921]: I0312 13:29:41.492907 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:41 crc kubenswrapper[4921]: I0312 13:29:41.503943 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 12 13:29:41 crc kubenswrapper[4921]: I0312 13:29:41.621044 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7f7ffb8f48-l6m2k"] Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.003115 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf411ab8-cf05-4dbb-99d6-05c15227d433" path="/var/lib/kubelet/pods/cf411ab8-cf05-4dbb-99d6-05c15227d433/volumes" Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.236121 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7ffb8f48-l6m2k" 
event={"ID":"0091a555-ed5b-415c-ba49-7d2c64fdf54d","Type":"ContainerStarted","Data":"9ae46fce399dee6ed8b42c5169b538cbcb1fad1788a99cc800e715e94f8a5b38"} Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.236165 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7ffb8f48-l6m2k" event={"ID":"0091a555-ed5b-415c-ba49-7d2c64fdf54d","Type":"ContainerStarted","Data":"0be300a332fdbe1adb170c1bc877bf6c902d75296ae1d74311a7afc5a10c8e6e"} Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.236176 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7f7ffb8f48-l6m2k" event={"ID":"0091a555-ed5b-415c-ba49-7d2c64fdf54d","Type":"ContainerStarted","Data":"e90b25ac45110885ada7d97ba782668a7225c038610984eb06e74dcfc739caf4"} Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.237339 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.237362 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.250269 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cda98bc-d6ac-4204-8477-8ecd7dafb976","Type":"ContainerStarted","Data":"3c48e8c03fe5b79c38f77e6a2314e5cabd0cb88a08f8d8f06da925fbda0f9451"} Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.250335 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cda98bc-d6ac-4204-8477-8ecd7dafb976","Type":"ContainerStarted","Data":"e0113be28e22a9fd7f4331568984dae7dee5170cd099481e9035a397ad451e04"} Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.260667 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7f7ffb8f48-l6m2k" podStartSLOduration=2.260650449 podStartE2EDuration="2.260650449s" 
podCreationTimestamp="2026-03-12 13:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:42.254530334 +0000 UTC m=+1204.944602305" watchObservedRunningTime="2026-03-12 13:29:42.260650449 +0000 UTC m=+1204.950722420" Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.562863 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bcbd96998-bx4p5" Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.627435 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5856fbd666-n2nmr"] Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.627631 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5856fbd666-n2nmr" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api-log" containerID="cri-o://c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506" gracePeriod=30 Mar 12 13:29:42 crc kubenswrapper[4921]: I0312 13:29:42.628304 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5856fbd666-n2nmr" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api" containerID="cri-o://588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79" gracePeriod=30 Mar 12 13:29:43 crc kubenswrapper[4921]: I0312 13:29:43.261594 4921 generic.go:334] "Generic (PLEG): container finished" podID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerID="c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506" exitCode=143 Mar 12 13:29:43 crc kubenswrapper[4921]: I0312 13:29:43.261670 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856fbd666-n2nmr" event={"ID":"cb38ca02-497e-48a0-8b6b-c6b1dbea5568","Type":"ContainerDied","Data":"c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506"} Mar 12 13:29:43 crc 
kubenswrapper[4921]: I0312 13:29:43.265620 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cda98bc-d6ac-4204-8477-8ecd7dafb976","Type":"ContainerStarted","Data":"9e1550fbf4ec4041ee622e35f595034c85a7a141e5d86d990a77fafcc28ac93d"} Mar 12 13:29:43 crc kubenswrapper[4921]: I0312 13:29:43.290384 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.290367224 podStartE2EDuration="3.290367224s" podCreationTimestamp="2026-03-12 13:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:29:43.284829547 +0000 UTC m=+1205.974901518" watchObservedRunningTime="2026-03-12 13:29:43.290367224 +0000 UTC m=+1205.980439195" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.291791 4921 generic.go:334] "Generic (PLEG): container finished" podID="57d73461-cb3e-4790-9576-1cb19e03815c" containerID="694464c1f45d4e2e6d1383d11455ef2cd84aec943d4c7963d7e68de0022a32c6" exitCode=0 Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.293614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-676f8c65df-nxrf5" event={"ID":"57d73461-cb3e-4790-9576-1cb19e03815c","Type":"ContainerDied","Data":"694464c1f45d4e2e6d1383d11455ef2cd84aec943d4c7963d7e68de0022a32c6"} Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.425592 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515419 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-public-tls-certs\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515465 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-ovndb-tls-certs\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515511 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-config\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515593 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-internal-tls-certs\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515668 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-combined-ca-bundle\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515745 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-httpd-config\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.515777 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmmtg\" (UniqueName: \"kubernetes.io/projected/57d73461-cb3e-4790-9576-1cb19e03815c-kube-api-access-lmmtg\") pod \"57d73461-cb3e-4790-9576-1cb19e03815c\" (UID: \"57d73461-cb3e-4790-9576-1cb19e03815c\") " Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.524009 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.527007 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d73461-cb3e-4790-9576-1cb19e03815c-kube-api-access-lmmtg" (OuterVolumeSpecName: "kube-api-access-lmmtg") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "kube-api-access-lmmtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.569000 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.591775 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.594283 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.602931 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.610915 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-config" (OuterVolumeSpecName: "config") pod "57d73461-cb3e-4790-9576-1cb19e03815c" (UID: "57d73461-cb3e-4790-9576-1cb19e03815c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617895 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617929 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617940 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmmtg\" (UniqueName: \"kubernetes.io/projected/57d73461-cb3e-4790-9576-1cb19e03815c-kube-api-access-lmmtg\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617950 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617958 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617965 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:44 crc kubenswrapper[4921]: I0312 13:29:44.617973 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57d73461-cb3e-4790-9576-1cb19e03815c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.236755 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c8b44c5c7-pc46f" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.310354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-676f8c65df-nxrf5" event={"ID":"57d73461-cb3e-4790-9576-1cb19e03815c","Type":"ContainerDied","Data":"e9cc563ce259d5dcbe0713e35563657a208561df51d83da10fd9ad4a0e4c24e7"} Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.310417 4921 scope.go:117] "RemoveContainer" containerID="882f00afa5a8574380a3cbcdeb51f134e2802146be94414cd8bf3305d0077bf2" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.310438 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-676f8c65df-nxrf5" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.340910 4921 scope.go:117] "RemoveContainer" containerID="694464c1f45d4e2e6d1383d11455ef2cd84aec943d4c7963d7e68de0022a32c6" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.364757 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-676f8c65df-nxrf5"] Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.378022 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-676f8c65df-nxrf5"] Mar 12 13:29:45 crc kubenswrapper[4921]: E0312 13:29:45.453605 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57d73461_cb3e_4790_9576_1cb19e03815c.slice\": RecentStats: unable to find data in memory cache]" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.589045 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 12 13:29:45 crc kubenswrapper[4921]: I0312 13:29:45.993885 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" 
path="/var/lib/kubelet/pods/57d73461-cb3e-4790-9576-1cb19e03815c/volumes" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.194778 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.341520 4921 generic.go:334] "Generic (PLEG): container finished" podID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerID="588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79" exitCode=0 Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.341561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856fbd666-n2nmr" event={"ID":"cb38ca02-497e-48a0-8b6b-c6b1dbea5568","Type":"ContainerDied","Data":"588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79"} Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.341586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5856fbd666-n2nmr" event={"ID":"cb38ca02-497e-48a0-8b6b-c6b1dbea5568","Type":"ContainerDied","Data":"6d88f854df1d88d540445cdf47bcab0327ea8000c83bc5a9d3f192150835dc20"} Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.341601 4921 scope.go:117] "RemoveContainer" containerID="588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.341657 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5856fbd666-n2nmr" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.345996 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-logs\") pod \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.346154 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-combined-ca-bundle\") pod \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.346331 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data\") pod \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.346365 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data-custom\") pod \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.346411 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7465p\" (UniqueName: \"kubernetes.io/projected/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-kube-api-access-7465p\") pod \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\" (UID: \"cb38ca02-497e-48a0-8b6b-c6b1dbea5568\") " Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.346587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-logs" (OuterVolumeSpecName: "logs") pod "cb38ca02-497e-48a0-8b6b-c6b1dbea5568" (UID: "cb38ca02-497e-48a0-8b6b-c6b1dbea5568"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.347237 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.367131 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-kube-api-access-7465p" (OuterVolumeSpecName: "kube-api-access-7465p") pod "cb38ca02-497e-48a0-8b6b-c6b1dbea5568" (UID: "cb38ca02-497e-48a0-8b6b-c6b1dbea5568"). InnerVolumeSpecName "kube-api-access-7465p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.367957 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb38ca02-497e-48a0-8b6b-c6b1dbea5568" (UID: "cb38ca02-497e-48a0-8b6b-c6b1dbea5568"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.394952 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb38ca02-497e-48a0-8b6b-c6b1dbea5568" (UID: "cb38ca02-497e-48a0-8b6b-c6b1dbea5568"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.449241 4921 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.449269 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7465p\" (UniqueName: \"kubernetes.io/projected/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-kube-api-access-7465p\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.449279 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.451892 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data" (OuterVolumeSpecName: "config-data") pod "cb38ca02-497e-48a0-8b6b-c6b1dbea5568" (UID: "cb38ca02-497e-48a0-8b6b-c6b1dbea5568"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.457773 4921 scope.go:117] "RemoveContainer" containerID="c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.472488 4921 scope.go:117] "RemoveContainer" containerID="588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79" Mar 12 13:29:46 crc kubenswrapper[4921]: E0312 13:29:46.472919 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79\": container with ID starting with 588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79 not found: ID does not exist" containerID="588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.472950 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79"} err="failed to get container status \"588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79\": rpc error: code = NotFound desc = could not find container \"588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79\": container with ID starting with 588d974d9e58cc3b539e3f1d86ba4f7dcc00d32eb710a7c99a6008eb0dcdcf79 not found: ID does not exist" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.472969 4921 scope.go:117] "RemoveContainer" containerID="c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506" Mar 12 13:29:46 crc kubenswrapper[4921]: E0312 13:29:46.473184 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506\": container with ID starting with 
c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506 not found: ID does not exist" containerID="c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.473218 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506"} err="failed to get container status \"c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506\": rpc error: code = NotFound desc = could not find container \"c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506\": container with ID starting with c6a1e4e25694e1b84b5ed7ffb7cafecd22b85b9a5bf70c69336d6c2e42112506 not found: ID does not exist" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.550501 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb38ca02-497e-48a0-8b6b-c6b1dbea5568-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.711613 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5856fbd666-n2nmr"] Mar 12 13:29:46 crc kubenswrapper[4921]: I0312 13:29:46.718424 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5856fbd666-n2nmr"] Mar 12 13:29:47 crc kubenswrapper[4921]: I0312 13:29:47.993026 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" path="/var/lib/kubelet/pods/cb38ca02-497e-48a0-8b6b-c6b1dbea5568/volumes" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069171 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 12 13:29:48 crc kubenswrapper[4921]: E0312 13:29:48.069490 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api-log" Mar 12 13:29:48 crc 
kubenswrapper[4921]: I0312 13:29:48.069505 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api-log" Mar 12 13:29:48 crc kubenswrapper[4921]: E0312 13:29:48.069524 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-httpd" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069531 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-httpd" Mar 12 13:29:48 crc kubenswrapper[4921]: E0312 13:29:48.069546 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069552 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api" Mar 12 13:29:48 crc kubenswrapper[4921]: E0312 13:29:48.069563 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-api" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069569 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-api" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069723 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api-log" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069733 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb38ca02-497e-48a0-8b6b-c6b1dbea5568" containerName="barbican-api" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.069744 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-api" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 
13:29:48.069755 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d73461-cb3e-4790-9576-1cb19e03815c" containerName="neutron-httpd" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.070275 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.073007 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.073584 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-s5x4c" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.073735 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.096116 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.175981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-openstack-config\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.176040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.176065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqznm\" (UniqueName: 
\"kubernetes.io/projected/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-kube-api-access-vqznm\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.176133 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-openstack-config-secret\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.277751 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-openstack-config-secret\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.277863 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-openstack-config\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.277897 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.277921 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqznm\" (UniqueName: \"kubernetes.io/projected/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-kube-api-access-vqznm\") pod \"openstackclient\" 
(UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.278946 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-openstack-config\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.284132 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-openstack-config-secret\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.284485 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.298408 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqznm\" (UniqueName: \"kubernetes.io/projected/345031e5-3e52-4b4e-ba3d-73bc5c3fe95d-kube-api-access-vqznm\") pod \"openstackclient\" (UID: \"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d\") " pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.430573 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 12 13:29:48 crc kubenswrapper[4921]: I0312 13:29:48.935156 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 12 13:29:49 crc kubenswrapper[4921]: I0312 13:29:49.370569 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d","Type":"ContainerStarted","Data":"14a8a03c7e5d77f95c65a625bbb691dd786062671dc066a83a93f0265169690d"} Mar 12 13:29:50 crc kubenswrapper[4921]: I0312 13:29:50.806652 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.406381 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-m98rn"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.408492 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.425543 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m98rn"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.534051 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-glss8"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.535196 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.543264 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-glss8"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.605655 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/314bb914-0157-480d-b873-57bcc6c6eaad-operator-scripts\") pod \"nova-api-db-create-m98rn\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.606020 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpv8p\" (UniqueName: \"kubernetes.io/projected/314bb914-0157-480d-b873-57bcc6c6eaad-kube-api-access-hpv8p\") pod \"nova-api-db-create-m98rn\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.612419 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c4c4-account-create-update-p77m4"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.614063 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.624921 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.638458 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-mxzdk"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.639553 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.647561 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c4c4-account-create-update-p77m4"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.658770 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mxzdk"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.707886 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba94cc5-7d33-4ec9-a923-f711f9794a5a-operator-scripts\") pod \"nova-cell0-db-create-glss8\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.708199 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpv8p\" (UniqueName: \"kubernetes.io/projected/314bb914-0157-480d-b873-57bcc6c6eaad-kube-api-access-hpv8p\") pod \"nova-api-db-create-m98rn\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.708459 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751eb572-c5a1-4154-bbf0-e076d215faed-operator-scripts\") pod \"nova-api-c4c4-account-create-update-p77m4\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.708670 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxsq\" (UniqueName: \"kubernetes.io/projected/39bbbc63-15d5-418d-bd13-95a97ae85e63-kube-api-access-qdxsq\") pod \"nova-cell1-db-create-mxzdk\" (UID: 
\"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.708800 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bbbc63-15d5-418d-bd13-95a97ae85e63-operator-scripts\") pod \"nova-cell1-db-create-mxzdk\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.708956 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r64ql\" (UniqueName: \"kubernetes.io/projected/751eb572-c5a1-4154-bbf0-e076d215faed-kube-api-access-r64ql\") pod \"nova-api-c4c4-account-create-update-p77m4\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.709094 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pvk6\" (UniqueName: \"kubernetes.io/projected/eba94cc5-7d33-4ec9-a923-f711f9794a5a-kube-api-access-8pvk6\") pod \"nova-cell0-db-create-glss8\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.709243 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/314bb914-0157-480d-b873-57bcc6c6eaad-operator-scripts\") pod \"nova-api-db-create-m98rn\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.710326 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/314bb914-0157-480d-b873-57bcc6c6eaad-operator-scripts\") 
pod \"nova-api-db-create-m98rn\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.732285 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpv8p\" (UniqueName: \"kubernetes.io/projected/314bb914-0157-480d-b873-57bcc6c6eaad-kube-api-access-hpv8p\") pod \"nova-api-db-create-m98rn\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.737732 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m98rn" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.811256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba94cc5-7d33-4ec9-a923-f711f9794a5a-operator-scripts\") pod \"nova-cell0-db-create-glss8\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.811359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751eb572-c5a1-4154-bbf0-e076d215faed-operator-scripts\") pod \"nova-api-c4c4-account-create-update-p77m4\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.811399 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxsq\" (UniqueName: \"kubernetes.io/projected/39bbbc63-15d5-418d-bd13-95a97ae85e63-kube-api-access-qdxsq\") pod \"nova-cell1-db-create-mxzdk\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.811456 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bbbc63-15d5-418d-bd13-95a97ae85e63-operator-scripts\") pod \"nova-cell1-db-create-mxzdk\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.811482 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r64ql\" (UniqueName: \"kubernetes.io/projected/751eb572-c5a1-4154-bbf0-e076d215faed-kube-api-access-r64ql\") pod \"nova-api-c4c4-account-create-update-p77m4\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.811538 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pvk6\" (UniqueName: \"kubernetes.io/projected/eba94cc5-7d33-4ec9-a923-f711f9794a5a-kube-api-access-8pvk6\") pod \"nova-cell0-db-create-glss8\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.812307 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba94cc5-7d33-4ec9-a923-f711f9794a5a-operator-scripts\") pod \"nova-cell0-db-create-glss8\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.812785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bbbc63-15d5-418d-bd13-95a97ae85e63-operator-scripts\") pod \"nova-cell1-db-create-mxzdk\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.812901 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751eb572-c5a1-4154-bbf0-e076d215faed-operator-scripts\") pod \"nova-api-c4c4-account-create-update-p77m4\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.817165 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ce1b-account-create-update-dpqvt"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.818157 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.820973 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.830942 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ce1b-account-create-update-dpqvt"] Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.838225 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r64ql\" (UniqueName: \"kubernetes.io/projected/751eb572-c5a1-4154-bbf0-e076d215faed-kube-api-access-r64ql\") pod \"nova-api-c4c4-account-create-update-p77m4\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.838624 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxsq\" (UniqueName: \"kubernetes.io/projected/39bbbc63-15d5-418d-bd13-95a97ae85e63-kube-api-access-qdxsq\") pod \"nova-cell1-db-create-mxzdk\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.841431 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pvk6\" 
(UniqueName: \"kubernetes.io/projected/eba94cc5-7d33-4ec9-a923-f711f9794a5a-kube-api-access-8pvk6\") pod \"nova-cell0-db-create-glss8\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.858118 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.912048 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd52e387-66f4-4b53-bfa7-23199af03b5e-operator-scripts\") pod \"nova-cell0-ce1b-account-create-update-dpqvt\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.912193 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tww4v\" (UniqueName: \"kubernetes.io/projected/cd52e387-66f4-4b53-bfa7-23199af03b5e-kube-api-access-tww4v\") pod \"nova-cell0-ce1b-account-create-update-dpqvt\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.931394 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:29:55 crc kubenswrapper[4921]: I0312 13:29:55.954955 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.014069 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd52e387-66f4-4b53-bfa7-23199af03b5e-operator-scripts\") pod \"nova-cell0-ce1b-account-create-update-dpqvt\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.014240 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tww4v\" (UniqueName: \"kubernetes.io/projected/cd52e387-66f4-4b53-bfa7-23199af03b5e-kube-api-access-tww4v\") pod \"nova-cell0-ce1b-account-create-update-dpqvt\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.015025 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd52e387-66f4-4b53-bfa7-23199af03b5e-operator-scripts\") pod \"nova-cell0-ce1b-account-create-update-dpqvt\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.016009 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-247a-account-create-update-4l9sn"] Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.017257 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.021997 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.035127 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tww4v\" (UniqueName: \"kubernetes.io/projected/cd52e387-66f4-4b53-bfa7-23199af03b5e-kube-api-access-tww4v\") pod \"nova-cell0-ce1b-account-create-update-dpqvt\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.049342 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-247a-account-create-update-4l9sn"] Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.088140 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.209581 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.218174 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5bgx\" (UniqueName: \"kubernetes.io/projected/956a8195-f544-4073-9042-544d311ef500-kube-api-access-v5bgx\") pod \"nova-cell1-247a-account-create-update-4l9sn\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.218372 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956a8195-f544-4073-9042-544d311ef500-operator-scripts\") pod \"nova-cell1-247a-account-create-update-4l9sn\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.320459 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956a8195-f544-4073-9042-544d311ef500-operator-scripts\") pod \"nova-cell1-247a-account-create-update-4l9sn\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.320526 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5bgx\" (UniqueName: \"kubernetes.io/projected/956a8195-f544-4073-9042-544d311ef500-kube-api-access-v5bgx\") pod \"nova-cell1-247a-account-create-update-4l9sn\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.322207 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/956a8195-f544-4073-9042-544d311ef500-operator-scripts\") pod \"nova-cell1-247a-account-create-update-4l9sn\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.332630 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.332677 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.332713 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.333369 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7722c7345ffa51f6b2d5016c3d605416a6961812caddc8f13639d2d6299573d"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.333424 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" 
containerID="cri-o://f7722c7345ffa51f6b2d5016c3d605416a6961812caddc8f13639d2d6299573d" gracePeriod=600 Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.339396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5bgx\" (UniqueName: \"kubernetes.io/projected/956a8195-f544-4073-9042-544d311ef500-kube-api-access-v5bgx\") pod \"nova-cell1-247a-account-create-update-4l9sn\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:56 crc kubenswrapper[4921]: I0312 13:29:56.635009 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.455803 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="f7722c7345ffa51f6b2d5016c3d605416a6961812caddc8f13639d2d6299573d" exitCode=0 Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.455876 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"f7722c7345ffa51f6b2d5016c3d605416a6961812caddc8f13639d2d6299573d"} Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.456577 4921 scope.go:117] "RemoveContainer" containerID="d3b38af5e8a74ac4ff0f8e664ea487d80830b0618d599c37b78cc47d7d985662" Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.668678 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.671859 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-central-agent" containerID="cri-o://8a4e54ba77111d37ff7d676f549552ee755ce8d297a481ce87b665e127397111" 
gracePeriod=30 Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.672027 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="proxy-httpd" containerID="cri-o://7e4a80126550f6bb637e807eeaddfb628a7fa3237155c65c26691d13eaab778f" gracePeriod=30 Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.672081 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="sg-core" containerID="cri-o://08f5b742b9a925ddd2db57514e75c2687399998cd21ba35f6b3848036beb9705" gracePeriod=30 Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.672124 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-notification-agent" containerID="cri-o://9f27556e5c3f50472dd219423dccabe1ab44e19b36b80ab0ee364851ae33325a" gracePeriod=30 Mar 12 13:29:57 crc kubenswrapper[4921]: W0312 13:29:57.828901 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod751eb572_c5a1_4154_bbf0_e076d215faed.slice/crio-eb45c98de6481c8a20c40f61f2286b8b19969c0d10315728e321388947b5c17a WatchSource:0}: Error finding container eb45c98de6481c8a20c40f61f2286b8b19969c0d10315728e321388947b5c17a: Status 404 returned error can't find the container with id eb45c98de6481c8a20c40f61f2286b8b19969c0d10315728e321388947b5c17a Mar 12 13:29:57 crc kubenswrapper[4921]: I0312 13:29:57.834030 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c4c4-account-create-update-p77m4"] Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.129087 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-247a-account-create-update-4l9sn"] Mar 12 13:29:58 crc kubenswrapper[4921]: 
I0312 13:29:58.138452 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-mxzdk"] Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.152449 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ce1b-account-create-update-dpqvt"] Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.168853 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m98rn"] Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.317576 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-glss8"] Mar 12 13:29:58 crc kubenswrapper[4921]: W0312 13:29:58.318325 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba94cc5_7d33_4ec9_a923_f711f9794a5a.slice/crio-7dae6af1b86cc9782763fa86646a9e1782e4c8a26eee03fc571e36d034e3b109 WatchSource:0}: Error finding container 7dae6af1b86cc9782763fa86646a9e1782e4c8a26eee03fc571e36d034e3b109: Status 404 returned error can't find the container with id 7dae6af1b86cc9782763fa86646a9e1782e4c8a26eee03fc571e36d034e3b109 Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.469330 4921 generic.go:334] "Generic (PLEG): container finished" podID="751eb572-c5a1-4154-bbf0-e076d215faed" containerID="cfea70834a70ba5fe61f533f89246186810dd47d5fec6d592137de4774f7d54b" exitCode=0 Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.469396 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c4c4-account-create-update-p77m4" event={"ID":"751eb572-c5a1-4154-bbf0-e076d215faed","Type":"ContainerDied","Data":"cfea70834a70ba5fe61f533f89246186810dd47d5fec6d592137de4774f7d54b"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.469622 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c4c4-account-create-update-p77m4" 
event={"ID":"751eb572-c5a1-4154-bbf0-e076d215faed","Type":"ContainerStarted","Data":"eb45c98de6481c8a20c40f61f2286b8b19969c0d10315728e321388947b5c17a"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.473564 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-glss8" event={"ID":"eba94cc5-7d33-4ec9-a923-f711f9794a5a","Type":"ContainerStarted","Data":"7dae6af1b86cc9782763fa86646a9e1782e4c8a26eee03fc571e36d034e3b109"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.475750 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" event={"ID":"956a8195-f544-4073-9042-544d311ef500","Type":"ContainerStarted","Data":"d491ffa6fae0b7ad9dd243372520b1f28648d649297b509fd54ca94e3731a022"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.477730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" event={"ID":"cd52e387-66f4-4b53-bfa7-23199af03b5e","Type":"ContainerStarted","Data":"460a141c22d92926c4d5bc68ebe44322882f34cc26b83ce4343ec2205ec4caac"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.479791 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"345031e5-3e52-4b4e-ba3d-73bc5c3fe95d","Type":"ContainerStarted","Data":"81610cd2ff25acf15657f203fbb745bc10f747ae592f035d0fa0af2d94f07f77"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.488344 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m98rn" event={"ID":"314bb914-0157-480d-b873-57bcc6c6eaad","Type":"ContainerStarted","Data":"d7064031d6d0fb127ecbcc0fc4eabf62b5cf1e12191caf0a0b2b5ddc410cd5a2"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.489541 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mxzdk" 
event={"ID":"39bbbc63-15d5-418d-bd13-95a97ae85e63","Type":"ContainerStarted","Data":"efc1849dc48c6b80de3573d5cd0fc9eea0b3877ab5d6eb844173aff1645e54ff"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491749 4921 generic.go:334] "Generic (PLEG): container finished" podID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerID="7e4a80126550f6bb637e807eeaddfb628a7fa3237155c65c26691d13eaab778f" exitCode=0 Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491780 4921 generic.go:334] "Generic (PLEG): container finished" podID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerID="08f5b742b9a925ddd2db57514e75c2687399998cd21ba35f6b3848036beb9705" exitCode=2 Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491793 4921 generic.go:334] "Generic (PLEG): container finished" podID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerID="9f27556e5c3f50472dd219423dccabe1ab44e19b36b80ab0ee364851ae33325a" exitCode=0 Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491802 4921 generic.go:334] "Generic (PLEG): container finished" podID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerID="8a4e54ba77111d37ff7d676f549552ee755ce8d297a481ce87b665e127397111" exitCode=0 Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerDied","Data":"7e4a80126550f6bb637e807eeaddfb628a7fa3237155c65c26691d13eaab778f"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491879 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerDied","Data":"08f5b742b9a925ddd2db57514e75c2687399998cd21ba35f6b3848036beb9705"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491892 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerDied","Data":"9f27556e5c3f50472dd219423dccabe1ab44e19b36b80ab0ee364851ae33325a"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.491905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerDied","Data":"8a4e54ba77111d37ff7d676f549552ee755ce8d297a481ce87b665e127397111"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.495422 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"7697d191845361ae138c9b2df2cde8ebed453242ceaff45c19913d28c03c6fd3"} Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.516805 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.024538207 podStartE2EDuration="10.516781913s" podCreationTimestamp="2026-03-12 13:29:48 +0000 UTC" firstStartedPulling="2026-03-12 13:29:48.930411015 +0000 UTC m=+1211.620482986" lastFinishedPulling="2026-03-12 13:29:57.422654721 +0000 UTC m=+1220.112726692" observedRunningTime="2026-03-12 13:29:58.508274646 +0000 UTC m=+1221.198346617" watchObservedRunningTime="2026-03-12 13:29:58.516781913 +0000 UTC m=+1221.206853884" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.555410 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677216 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgdpp\" (UniqueName: \"kubernetes.io/projected/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-kube-api-access-qgdpp\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677266 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-scripts\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677293 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-log-httpd\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677312 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-run-httpd\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677351 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-sg-core-conf-yaml\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677409 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-combined-ca-bundle\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.677466 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-config-data\") pod \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\" (UID: \"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500\") " Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.678032 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.678143 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.678174 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.686508 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-scripts" (OuterVolumeSpecName: "scripts") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.687541 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-kube-api-access-qgdpp" (OuterVolumeSpecName: "kube-api-access-qgdpp") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "kube-api-access-qgdpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.735012 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.777010 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.780008 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.780049 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdpp\" (UniqueName: \"kubernetes.io/projected/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-kube-api-access-qgdpp\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.780066 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.780081 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.780094 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.818850 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-config-data" (OuterVolumeSpecName: "config-data") pod "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" (UID: "a0c119c7-ae7a-4f1c-9d57-8ae81fe32500"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:29:58 crc kubenswrapper[4921]: I0312 13:29:58.882097 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.506926 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.506930 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0c119c7-ae7a-4f1c-9d57-8ae81fe32500","Type":"ContainerDied","Data":"9c854b2f3ee25a0a9399c8f6a66d36484e172682ac103aed5971a8392f0d47f4"} Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.507502 4921 scope.go:117] "RemoveContainer" containerID="7e4a80126550f6bb637e807eeaddfb628a7fa3237155c65c26691d13eaab778f" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.509871 4921 generic.go:334] "Generic (PLEG): container finished" podID="eba94cc5-7d33-4ec9-a923-f711f9794a5a" containerID="cb20a646af29c4b79c48b9d2b43126e629bcb9df396a8e9a5b583c72aa642e88" exitCode=0 Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.509988 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-glss8" event={"ID":"eba94cc5-7d33-4ec9-a923-f711f9794a5a","Type":"ContainerDied","Data":"cb20a646af29c4b79c48b9d2b43126e629bcb9df396a8e9a5b583c72aa642e88"} Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.522333 4921 generic.go:334] "Generic (PLEG): container finished" podID="cd52e387-66f4-4b53-bfa7-23199af03b5e" containerID="a363a2891ae209c0738384e64d213015ecdd740a02bafc4e7683a47b027c8f77" exitCode=0 Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.522497 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" 
event={"ID":"cd52e387-66f4-4b53-bfa7-23199af03b5e","Type":"ContainerDied","Data":"a363a2891ae209c0738384e64d213015ecdd740a02bafc4e7683a47b027c8f77"} Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.531512 4921 generic.go:334] "Generic (PLEG): container finished" podID="956a8195-f544-4073-9042-544d311ef500" containerID="95e6cfa841f2a0604bdaa1a3bc441224014b19a3878b899fd92d48dd8160d5af" exitCode=0 Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.531613 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" event={"ID":"956a8195-f544-4073-9042-544d311ef500","Type":"ContainerDied","Data":"95e6cfa841f2a0604bdaa1a3bc441224014b19a3878b899fd92d48dd8160d5af"} Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.534529 4921 generic.go:334] "Generic (PLEG): container finished" podID="314bb914-0157-480d-b873-57bcc6c6eaad" containerID="4132b1c906451df3acc71d98e9d47f8d05eed4f07057f8c7a5a3285dea15d9cb" exitCode=0 Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.534591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m98rn" event={"ID":"314bb914-0157-480d-b873-57bcc6c6eaad","Type":"ContainerDied","Data":"4132b1c906451df3acc71d98e9d47f8d05eed4f07057f8c7a5a3285dea15d9cb"} Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.536191 4921 generic.go:334] "Generic (PLEG): container finished" podID="39bbbc63-15d5-418d-bd13-95a97ae85e63" containerID="d6f5cbdfc8deaeb448f96c4bc2f6407691a67838a0ec753aad5250273989bbc6" exitCode=0 Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.539699 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mxzdk" event={"ID":"39bbbc63-15d5-418d-bd13-95a97ae85e63","Type":"ContainerDied","Data":"d6f5cbdfc8deaeb448f96c4bc2f6407691a67838a0ec753aad5250273989bbc6"} Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.548940 4921 scope.go:117] "RemoveContainer" 
containerID="08f5b742b9a925ddd2db57514e75c2687399998cd21ba35f6b3848036beb9705" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.583006 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.603595 4921 scope.go:117] "RemoveContainer" containerID="9f27556e5c3f50472dd219423dccabe1ab44e19b36b80ab0ee364851ae33325a" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.610627 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.626324 4921 scope.go:117] "RemoveContainer" containerID="8a4e54ba77111d37ff7d676f549552ee755ce8d297a481ce87b665e127397111" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632064 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:59 crc kubenswrapper[4921]: E0312 13:29:59.632437 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-notification-agent" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632454 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-notification-agent" Mar 12 13:29:59 crc kubenswrapper[4921]: E0312 13:29:59.632465 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-central-agent" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632470 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-central-agent" Mar 12 13:29:59 crc kubenswrapper[4921]: E0312 13:29:59.632487 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="proxy-httpd" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632494 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="proxy-httpd" Mar 12 13:29:59 crc kubenswrapper[4921]: E0312 13:29:59.632518 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="sg-core" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632523 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="sg-core" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632759 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="proxy-httpd" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632885 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-central-agent" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632922 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="sg-core" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.632940 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" containerName="ceilometer-notification-agent" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.634696 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.637239 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.637551 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.654865 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.802515 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.803651 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-scripts\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.803705 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.803802 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-config-data\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " 
pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.803845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-run-httpd\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.803867 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfcgh\" (UniqueName: \"kubernetes.io/projected/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-kube-api-access-mfcgh\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.803907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-log-httpd\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.905673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.905750 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-scripts\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.905792 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.905903 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-config-data\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.905938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-run-httpd\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.905967 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfcgh\" (UniqueName: \"kubernetes.io/projected/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-kube-api-access-mfcgh\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.906014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-log-httpd\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.906569 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-log-httpd\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc 
kubenswrapper[4921]: I0312 13:29:59.906880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-run-httpd\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.913341 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-scripts\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.913396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.913731 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.914035 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-config-data\") pod \"ceilometer-0\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.930937 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfcgh\" (UniqueName: \"kubernetes.io/projected/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-kube-api-access-mfcgh\") pod \"ceilometer-0\" (UID: 
\"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.963884 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:29:59 crc kubenswrapper[4921]: I0312 13:29:59.996980 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c119c7-ae7a-4f1c-9d57-8ae81fe32500" path="/var/lib/kubelet/pods/a0c119c7-ae7a-4f1c-9d57-8ae81fe32500/volumes" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.002507 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.111361 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751eb572-c5a1-4154-bbf0-e076d215faed-operator-scripts\") pod \"751eb572-c5a1-4154-bbf0-e076d215faed\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.112476 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r64ql\" (UniqueName: \"kubernetes.io/projected/751eb572-c5a1-4154-bbf0-e076d215faed-kube-api-access-r64ql\") pod \"751eb572-c5a1-4154-bbf0-e076d215faed\" (UID: \"751eb572-c5a1-4154-bbf0-e076d215faed\") " Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.115302 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/751eb572-c5a1-4154-bbf0-e076d215faed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "751eb572-c5a1-4154-bbf0-e076d215faed" (UID: "751eb572-c5a1-4154-bbf0-e076d215faed"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.118148 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751eb572-c5a1-4154-bbf0-e076d215faed-kube-api-access-r64ql" (OuterVolumeSpecName: "kube-api-access-r64ql") pod "751eb572-c5a1-4154-bbf0-e076d215faed" (UID: "751eb572-c5a1-4154-bbf0-e076d215faed"). InnerVolumeSpecName "kube-api-access-r64ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.132240 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555370-6g99d"] Mar 12 13:30:00 crc kubenswrapper[4921]: E0312 13:30:00.132612 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751eb572-c5a1-4154-bbf0-e076d215faed" containerName="mariadb-account-create-update" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.132630 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="751eb572-c5a1-4154-bbf0-e076d215faed" containerName="mariadb-account-create-update" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.132799 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="751eb572-c5a1-4154-bbf0-e076d215faed" containerName="mariadb-account-create-update" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.133339 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.137393 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.137633 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.137926 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.146258 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555370-6g99d"] Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.157111 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l"] Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.158199 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.164422 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.164613 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.170858 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l"] Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.214383 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/751eb572-c5a1-4154-bbf0-e076d215faed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.214407 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r64ql\" (UniqueName: \"kubernetes.io/projected/751eb572-c5a1-4154-bbf0-e076d215faed-kube-api-access-r64ql\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.315780 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-config-volume\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.315915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hds\" (UniqueName: \"kubernetes.io/projected/f4c99a71-792e-4bc1-81d5-e75c67437787-kube-api-access-l5hds\") pod 
\"auto-csr-approver-29555370-6g99d\" (UID: \"f4c99a71-792e-4bc1-81d5-e75c67437787\") " pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.315967 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gn2\" (UniqueName: \"kubernetes.io/projected/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-kube-api-access-c6gn2\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.315995 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-secret-volume\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.418293 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-config-volume\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.418418 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5hds\" (UniqueName: \"kubernetes.io/projected/f4c99a71-792e-4bc1-81d5-e75c67437787-kube-api-access-l5hds\") pod \"auto-csr-approver-29555370-6g99d\" (UID: \"f4c99a71-792e-4bc1-81d5-e75c67437787\") " pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.418490 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-c6gn2\" (UniqueName: \"kubernetes.io/projected/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-kube-api-access-c6gn2\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.418518 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-secret-volume\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.419716 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-config-volume\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.426375 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-secret-volume\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.437211 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gn2\" (UniqueName: \"kubernetes.io/projected/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-kube-api-access-c6gn2\") pod \"collect-profiles-29555370-7cx7l\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc 
kubenswrapper[4921]: I0312 13:30:00.444880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5hds\" (UniqueName: \"kubernetes.io/projected/f4c99a71-792e-4bc1-81d5-e75c67437787-kube-api-access-l5hds\") pod \"auto-csr-approver-29555370-6g99d\" (UID: \"f4c99a71-792e-4bc1-81d5-e75c67437787\") " pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.460066 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.483351 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.494731 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.570952 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c4c4-account-create-update-p77m4" event={"ID":"751eb572-c5a1-4154-bbf0-e076d215faed","Type":"ContainerDied","Data":"eb45c98de6481c8a20c40f61f2286b8b19969c0d10315728e321388947b5c17a"} Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.571207 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb45c98de6481c8a20c40f61f2286b8b19969c0d10315728e321388947b5c17a" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.571373 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c4c4-account-create-update-p77m4" Mar 12 13:30:00 crc kubenswrapper[4921]: I0312 13:30:00.574103 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerStarted","Data":"b034ec47c3ead02211681f6ab8a6b4e915d1aca4bb25c7a373e56b072220bf56"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.031501 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.037597 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m98rn" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.052775 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.063789 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.135277 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5bgx\" (UniqueName: \"kubernetes.io/projected/956a8195-f544-4073-9042-544d311ef500-kube-api-access-v5bgx\") pod \"956a8195-f544-4073-9042-544d311ef500\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.135386 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39bbbc63-15d5-418d-bd13-95a97ae85e63-operator-scripts\") pod \"39bbbc63-15d5-418d-bd13-95a97ae85e63\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.135472 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956a8195-f544-4073-9042-544d311ef500-operator-scripts\") pod \"956a8195-f544-4073-9042-544d311ef500\" (UID: \"956a8195-f544-4073-9042-544d311ef500\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.135537 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdxsq\" (UniqueName: \"kubernetes.io/projected/39bbbc63-15d5-418d-bd13-95a97ae85e63-kube-api-access-qdxsq\") pod \"39bbbc63-15d5-418d-bd13-95a97ae85e63\" (UID: \"39bbbc63-15d5-418d-bd13-95a97ae85e63\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.135606 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/314bb914-0157-480d-b873-57bcc6c6eaad-operator-scripts\") pod \"314bb914-0157-480d-b873-57bcc6c6eaad\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.135657 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hpv8p\" (UniqueName: \"kubernetes.io/projected/314bb914-0157-480d-b873-57bcc6c6eaad-kube-api-access-hpv8p\") pod \"314bb914-0157-480d-b873-57bcc6c6eaad\" (UID: \"314bb914-0157-480d-b873-57bcc6c6eaad\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.136305 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/956a8195-f544-4073-9042-544d311ef500-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "956a8195-f544-4073-9042-544d311ef500" (UID: "956a8195-f544-4073-9042-544d311ef500"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.136733 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314bb914-0157-480d-b873-57bcc6c6eaad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "314bb914-0157-480d-b873-57bcc6c6eaad" (UID: "314bb914-0157-480d-b873-57bcc6c6eaad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.137162 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39bbbc63-15d5-418d-bd13-95a97ae85e63-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39bbbc63-15d5-418d-bd13-95a97ae85e63" (UID: "39bbbc63-15d5-418d-bd13-95a97ae85e63"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.142010 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39bbbc63-15d5-418d-bd13-95a97ae85e63-kube-api-access-qdxsq" (OuterVolumeSpecName: "kube-api-access-qdxsq") pod "39bbbc63-15d5-418d-bd13-95a97ae85e63" (UID: "39bbbc63-15d5-418d-bd13-95a97ae85e63"). 
InnerVolumeSpecName "kube-api-access-qdxsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.152838 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314bb914-0157-480d-b873-57bcc6c6eaad-kube-api-access-hpv8p" (OuterVolumeSpecName: "kube-api-access-hpv8p") pod "314bb914-0157-480d-b873-57bcc6c6eaad" (UID: "314bb914-0157-480d-b873-57bcc6c6eaad"). InnerVolumeSpecName "kube-api-access-hpv8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.152885 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956a8195-f544-4073-9042-544d311ef500-kube-api-access-v5bgx" (OuterVolumeSpecName: "kube-api-access-v5bgx") pod "956a8195-f544-4073-9042-544d311ef500" (UID: "956a8195-f544-4073-9042-544d311ef500"). InnerVolumeSpecName "kube-api-access-v5bgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.178220 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.190682 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555370-6g99d"] Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.237221 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd52e387-66f4-4b53-bfa7-23199af03b5e-operator-scripts\") pod \"cd52e387-66f4-4b53-bfa7-23199af03b5e\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.237434 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tww4v\" (UniqueName: \"kubernetes.io/projected/cd52e387-66f4-4b53-bfa7-23199af03b5e-kube-api-access-tww4v\") pod \"cd52e387-66f4-4b53-bfa7-23199af03b5e\" (UID: \"cd52e387-66f4-4b53-bfa7-23199af03b5e\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.237987 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/314bb914-0157-480d-b873-57bcc6c6eaad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.238011 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpv8p\" (UniqueName: \"kubernetes.io/projected/314bb914-0157-480d-b873-57bcc6c6eaad-kube-api-access-hpv8p\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.238024 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5bgx\" (UniqueName: \"kubernetes.io/projected/956a8195-f544-4073-9042-544d311ef500-kube-api-access-v5bgx\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.238037 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/39bbbc63-15d5-418d-bd13-95a97ae85e63-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.238047 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/956a8195-f544-4073-9042-544d311ef500-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.238060 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdxsq\" (UniqueName: \"kubernetes.io/projected/39bbbc63-15d5-418d-bd13-95a97ae85e63-kube-api-access-qdxsq\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.238890 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd52e387-66f4-4b53-bfa7-23199af03b5e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd52e387-66f4-4b53-bfa7-23199af03b5e" (UID: "cd52e387-66f4-4b53-bfa7-23199af03b5e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.245011 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd52e387-66f4-4b53-bfa7-23199af03b5e-kube-api-access-tww4v" (OuterVolumeSpecName: "kube-api-access-tww4v") pod "cd52e387-66f4-4b53-bfa7-23199af03b5e" (UID: "cd52e387-66f4-4b53-bfa7-23199af03b5e"). InnerVolumeSpecName "kube-api-access-tww4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.325495 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l"] Mar 12 13:30:01 crc kubenswrapper[4921]: W0312 13:30:01.329922 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e52e7fe_8b59_4e0f_a70d_cc63836749b4.slice/crio-79bb3d25af8930a14a3721f3a2ebf6a63cdef8aa0f4df87280f9d84dcada7aa1 WatchSource:0}: Error finding container 79bb3d25af8930a14a3721f3a2ebf6a63cdef8aa0f4df87280f9d84dcada7aa1: Status 404 returned error can't find the container with id 79bb3d25af8930a14a3721f3a2ebf6a63cdef8aa0f4df87280f9d84dcada7aa1 Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.339228 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba94cc5-7d33-4ec9-a923-f711f9794a5a-operator-scripts\") pod \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.339578 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pvk6\" (UniqueName: \"kubernetes.io/projected/eba94cc5-7d33-4ec9-a923-f711f9794a5a-kube-api-access-8pvk6\") pod \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\" (UID: \"eba94cc5-7d33-4ec9-a923-f711f9794a5a\") " Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.340026 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tww4v\" (UniqueName: \"kubernetes.io/projected/cd52e387-66f4-4b53-bfa7-23199af03b5e-kube-api-access-tww4v\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.340048 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cd52e387-66f4-4b53-bfa7-23199af03b5e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.340930 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba94cc5-7d33-4ec9-a923-f711f9794a5a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eba94cc5-7d33-4ec9-a923-f711f9794a5a" (UID: "eba94cc5-7d33-4ec9-a923-f711f9794a5a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.343408 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba94cc5-7d33-4ec9-a923-f711f9794a5a-kube-api-access-8pvk6" (OuterVolumeSpecName: "kube-api-access-8pvk6") pod "eba94cc5-7d33-4ec9-a923-f711f9794a5a" (UID: "eba94cc5-7d33-4ec9-a923-f711f9794a5a"). InnerVolumeSpecName "kube-api-access-8pvk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.442256 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pvk6\" (UniqueName: \"kubernetes.io/projected/eba94cc5-7d33-4ec9-a923-f711f9794a5a-kube-api-access-8pvk6\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.442289 4921 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba94cc5-7d33-4ec9-a923-f711f9794a5a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.598282 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.598299 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ce1b-account-create-update-dpqvt" event={"ID":"cd52e387-66f4-4b53-bfa7-23199af03b5e","Type":"ContainerDied","Data":"460a141c22d92926c4d5bc68ebe44322882f34cc26b83ce4343ec2205ec4caac"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.598339 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460a141c22d92926c4d5bc68ebe44322882f34cc26b83ce4343ec2205ec4caac" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.606260 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m98rn" event={"ID":"314bb914-0157-480d-b873-57bcc6c6eaad","Type":"ContainerDied","Data":"d7064031d6d0fb127ecbcc0fc4eabf62b5cf1e12191caf0a0b2b5ddc410cd5a2"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.606298 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7064031d6d0fb127ecbcc0fc4eabf62b5cf1e12191caf0a0b2b5ddc410cd5a2" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.606362 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m98rn" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.610791 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.611163 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-247a-account-create-update-4l9sn" event={"ID":"956a8195-f544-4073-9042-544d311ef500","Type":"ContainerDied","Data":"d491ffa6fae0b7ad9dd243372520b1f28648d649297b509fd54ca94e3731a022"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.611193 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d491ffa6fae0b7ad9dd243372520b1f28648d649297b509fd54ca94e3731a022" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.613340 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" event={"ID":"9e52e7fe-8b59-4e0f-a70d-cc63836749b4","Type":"ContainerStarted","Data":"79bb3d25af8930a14a3721f3a2ebf6a63cdef8aa0f4df87280f9d84dcada7aa1"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.614591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-mxzdk" event={"ID":"39bbbc63-15d5-418d-bd13-95a97ae85e63","Type":"ContainerDied","Data":"efc1849dc48c6b80de3573d5cd0fc9eea0b3877ab5d6eb844173aff1645e54ff"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.614609 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-mxzdk" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.614617 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc1849dc48c6b80de3573d5cd0fc9eea0b3877ab5d6eb844173aff1645e54ff" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.615751 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555370-6g99d" event={"ID":"f4c99a71-792e-4bc1-81d5-e75c67437787","Type":"ContainerStarted","Data":"ba58d44aacf1bffcaebef566a1a069ef98d78f503251ef9c28ac14de6c066656"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.624106 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-glss8" event={"ID":"eba94cc5-7d33-4ec9-a923-f711f9794a5a","Type":"ContainerDied","Data":"7dae6af1b86cc9782763fa86646a9e1782e4c8a26eee03fc571e36d034e3b109"} Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.624141 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dae6af1b86cc9782763fa86646a9e1782e4c8a26eee03fc571e36d034e3b109" Mar 12 13:30:01 crc kubenswrapper[4921]: I0312 13:30:01.624202 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-glss8" Mar 12 13:30:02 crc kubenswrapper[4921]: I0312 13:30:02.634255 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerStarted","Data":"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7"} Mar 12 13:30:02 crc kubenswrapper[4921]: I0312 13:30:02.636177 4921 generic.go:334] "Generic (PLEG): container finished" podID="9e52e7fe-8b59-4e0f-a70d-cc63836749b4" containerID="3c6e616b4b05287a4a4056d55160b0b809b8de7a24bcd9a779b790e54a669cb9" exitCode=0 Mar 12 13:30:02 crc kubenswrapper[4921]: I0312 13:30:02.636223 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" event={"ID":"9e52e7fe-8b59-4e0f-a70d-cc63836749b4","Type":"ContainerDied","Data":"3c6e616b4b05287a4a4056d55160b0b809b8de7a24bcd9a779b790e54a669cb9"} Mar 12 13:30:03 crc kubenswrapper[4921]: I0312 13:30:03.368199 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:03 crc kubenswrapper[4921]: I0312 13:30:03.645843 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555370-6g99d" event={"ID":"f4c99a71-792e-4bc1-81d5-e75c67437787","Type":"ContainerStarted","Data":"514a39081be976dbb2b8573ffca91f0b4371f6c7fa4a3fbf11e8f04783c97598"} Mar 12 13:30:03 crc kubenswrapper[4921]: I0312 13:30:03.648656 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerStarted","Data":"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2"} Mar 12 13:30:03 crc kubenswrapper[4921]: I0312 13:30:03.648725 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerStarted","Data":"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327"} Mar 12 13:30:03 crc kubenswrapper[4921]: I0312 13:30:03.669519 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555370-6g99d" podStartSLOduration=1.8338831789999999 podStartE2EDuration="3.669502335s" podCreationTimestamp="2026-03-12 13:30:00 +0000 UTC" firstStartedPulling="2026-03-12 13:30:01.189859032 +0000 UTC m=+1223.879931003" lastFinishedPulling="2026-03-12 13:30:03.025478158 +0000 UTC m=+1225.715550159" observedRunningTime="2026-03-12 13:30:03.664129662 +0000 UTC m=+1226.354201633" watchObservedRunningTime="2026-03-12 13:30:03.669502335 +0000 UTC m=+1226.359574306" Mar 12 13:30:03 crc kubenswrapper[4921]: I0312 13:30:03.965095 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.135426 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-config-volume\") pod \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.135477 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-secret-volume\") pod \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.135602 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gn2\" (UniqueName: \"kubernetes.io/projected/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-kube-api-access-c6gn2\") pod 
\"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\" (UID: \"9e52e7fe-8b59-4e0f-a70d-cc63836749b4\") " Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.136336 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "9e52e7fe-8b59-4e0f-a70d-cc63836749b4" (UID: "9e52e7fe-8b59-4e0f-a70d-cc63836749b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.136686 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.142046 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9e52e7fe-8b59-4e0f-a70d-cc63836749b4" (UID: "9e52e7fe-8b59-4e0f-a70d-cc63836749b4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.148162 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-kube-api-access-c6gn2" (OuterVolumeSpecName: "kube-api-access-c6gn2") pod "9e52e7fe-8b59-4e0f-a70d-cc63836749b4" (UID: "9e52e7fe-8b59-4e0f-a70d-cc63836749b4"). InnerVolumeSpecName "kube-api-access-c6gn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.238239 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.238279 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gn2\" (UniqueName: \"kubernetes.io/projected/9e52e7fe-8b59-4e0f-a70d-cc63836749b4-kube-api-access-c6gn2\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.657736 4921 generic.go:334] "Generic (PLEG): container finished" podID="f4c99a71-792e-4bc1-81d5-e75c67437787" containerID="514a39081be976dbb2b8573ffca91f0b4371f6c7fa4a3fbf11e8f04783c97598" exitCode=0 Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.657980 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555370-6g99d" event={"ID":"f4c99a71-792e-4bc1-81d5-e75c67437787","Type":"ContainerDied","Data":"514a39081be976dbb2b8573ffca91f0b4371f6c7fa4a3fbf11e8f04783c97598"} Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.659561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" event={"ID":"9e52e7fe-8b59-4e0f-a70d-cc63836749b4","Type":"ContainerDied","Data":"79bb3d25af8930a14a3721f3a2ebf6a63cdef8aa0f4df87280f9d84dcada7aa1"} Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.659585 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79bb3d25af8930a14a3721f3a2ebf6a63cdef8aa0f4df87280f9d84dcada7aa1" Mar 12 13:30:04 crc kubenswrapper[4921]: I0312 13:30:04.659646 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.028960 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9g6d"] Mar 12 13:30:06 crc kubenswrapper[4921]: E0312 13:30:06.029727 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956a8195-f544-4073-9042-544d311ef500" containerName="mariadb-account-create-update" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.029739 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="956a8195-f544-4073-9042-544d311ef500" containerName="mariadb-account-create-update" Mar 12 13:30:06 crc kubenswrapper[4921]: E0312 13:30:06.029754 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314bb914-0157-480d-b873-57bcc6c6eaad" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.029769 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="314bb914-0157-480d-b873-57bcc6c6eaad" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: E0312 13:30:06.029785 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e52e7fe-8b59-4e0f-a70d-cc63836749b4" containerName="collect-profiles" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.029790 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e52e7fe-8b59-4e0f-a70d-cc63836749b4" containerName="collect-profiles" Mar 12 13:30:06 crc kubenswrapper[4921]: E0312 13:30:06.029806 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba94cc5-7d33-4ec9-a923-f711f9794a5a" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.029824 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba94cc5-7d33-4ec9-a923-f711f9794a5a" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: E0312 13:30:06.029834 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd52e387-66f4-4b53-bfa7-23199af03b5e" containerName="mariadb-account-create-update" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.029840 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd52e387-66f4-4b53-bfa7-23199af03b5e" containerName="mariadb-account-create-update" Mar 12 13:30:06 crc kubenswrapper[4921]: E0312 13:30:06.029852 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39bbbc63-15d5-418d-bd13-95a97ae85e63" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.029858 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="39bbbc63-15d5-418d-bd13-95a97ae85e63" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030138 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e52e7fe-8b59-4e0f-a70d-cc63836749b4" containerName="collect-profiles" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030170 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba94cc5-7d33-4ec9-a923-f711f9794a5a" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030178 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="39bbbc63-15d5-418d-bd13-95a97ae85e63" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030193 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd52e387-66f4-4b53-bfa7-23199af03b5e" containerName="mariadb-account-create-update" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030201 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="314bb914-0157-480d-b873-57bcc6c6eaad" containerName="mariadb-database-create" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030214 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="956a8195-f544-4073-9042-544d311ef500" 
containerName="mariadb-account-create-update" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.030739 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.033939 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.034043 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wwpsw" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.034435 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.041828 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9g6d"] Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.094386 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.175384 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-config-data\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.175739 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-scripts\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.179444 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzgs\" (UniqueName: \"kubernetes.io/projected/409470f7-5137-49c5-8d79-358a4466e1db-kube-api-access-znzgs\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.179718 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.283518 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5hds\" (UniqueName: 
\"kubernetes.io/projected/f4c99a71-792e-4bc1-81d5-e75c67437787-kube-api-access-l5hds\") pod \"f4c99a71-792e-4bc1-81d5-e75c67437787\" (UID: \"f4c99a71-792e-4bc1-81d5-e75c67437787\") " Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.284090 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-scripts\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.284154 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzgs\" (UniqueName: \"kubernetes.io/projected/409470f7-5137-49c5-8d79-358a4466e1db-kube-api-access-znzgs\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.284204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.284245 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-config-data\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.297753 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f4c99a71-792e-4bc1-81d5-e75c67437787-kube-api-access-l5hds" (OuterVolumeSpecName: "kube-api-access-l5hds") pod "f4c99a71-792e-4bc1-81d5-e75c67437787" (UID: "f4c99a71-792e-4bc1-81d5-e75c67437787"). InnerVolumeSpecName "kube-api-access-l5hds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.300264 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-config-data\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.302437 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-scripts\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.302891 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzgs\" (UniqueName: \"kubernetes.io/projected/409470f7-5137-49c5-8d79-358a4466e1db-kube-api-access-znzgs\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.303333 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-t9g6d\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") " pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.385973 4921 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-l5hds\" (UniqueName: \"kubernetes.io/projected/f4c99a71-792e-4bc1-81d5-e75c67437787-kube-api-access-l5hds\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.421574 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.430795 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.512475 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69d56fdd9b-bhnqx"] Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.512714 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69d56fdd9b-bhnqx" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-api" containerID="cri-o://5de3ad8d3563a1c396e01a64104cc10969c7df05da22fb2e628de17147678961" gracePeriod=30 Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.513200 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-69d56fdd9b-bhnqx" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-httpd" containerID="cri-o://6208e6e16328b07ee57acad0b7d54e2206e69379e81312961f6b0b4954b50748" gracePeriod=30 Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.679120 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555370-6g99d" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.680002 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555370-6g99d" event={"ID":"f4c99a71-792e-4bc1-81d5-e75c67437787","Type":"ContainerDied","Data":"ba58d44aacf1bffcaebef566a1a069ef98d78f503251ef9c28ac14de6c066656"} Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.680040 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba58d44aacf1bffcaebef566a1a069ef98d78f503251ef9c28ac14de6c066656" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.684748 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerStarted","Data":"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1"} Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.684919 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-central-agent" containerID="cri-o://0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" gracePeriod=30 Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.685138 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.685366 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="proxy-httpd" containerID="cri-o://f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" gracePeriod=30 Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.685406 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" 
containerName="sg-core" containerID="cri-o://9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" gracePeriod=30 Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.685439 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-notification-agent" containerID="cri-o://2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" gracePeriod=30 Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.744163 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.39987484 podStartE2EDuration="7.744146531s" podCreationTimestamp="2026-03-12 13:29:59 +0000 UTC" firstStartedPulling="2026-03-12 13:30:00.503781884 +0000 UTC m=+1223.193853845" lastFinishedPulling="2026-03-12 13:30:05.848053555 +0000 UTC m=+1228.538125536" observedRunningTime="2026-03-12 13:30:06.734034146 +0000 UTC m=+1229.424106117" watchObservedRunningTime="2026-03-12 13:30:06.744146531 +0000 UTC m=+1229.434218502" Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.751250 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555364-rx4zr"] Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.757626 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555364-rx4zr"] Mar 12 13:30:06 crc kubenswrapper[4921]: I0312 13:30:06.981651 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9g6d"] Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.386733 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.421829 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-log-httpd\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.421898 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-combined-ca-bundle\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.421969 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-run-httpd\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.421987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfcgh\" (UniqueName: \"kubernetes.io/projected/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-kube-api-access-mfcgh\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.422033 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-scripts\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.422137 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-sg-core-conf-yaml\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.422183 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-config-data\") pod \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\" (UID: \"76e28dd2-ce66-4efc-b2fd-b3b221144aa6\") " Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.422545 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.422742 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.427285 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-kube-api-access-mfcgh" (OuterVolumeSpecName: "kube-api-access-mfcgh") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "kube-api-access-mfcgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.428100 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-scripts" (OuterVolumeSpecName: "scripts") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.450039 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.479992 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.496597 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-config-data" (OuterVolumeSpecName: "config-data") pod "76e28dd2-ce66-4efc-b2fd-b3b221144aa6" (UID: "76e28dd2-ce66-4efc-b2fd-b3b221144aa6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524373 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524429 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524439 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524449 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfcgh\" (UniqueName: \"kubernetes.io/projected/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-kube-api-access-mfcgh\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524477 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524487 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.524497 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76e28dd2-ce66-4efc-b2fd-b3b221144aa6-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.713357 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" event={"ID":"409470f7-5137-49c5-8d79-358a4466e1db","Type":"ContainerStarted","Data":"d5648e8d7429e7905b8234a23256e1a5c2272db0152ab7de628e58591e6f95ca"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717422 4921 generic.go:334] "Generic (PLEG): container finished" podID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" exitCode=0 Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717458 4921 generic.go:334] "Generic (PLEG): container finished" podID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" exitCode=2 Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717469 4921 generic.go:334] "Generic (PLEG): container finished" podID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" exitCode=0 Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717477 4921 generic.go:334] "Generic (PLEG): container finished" podID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" exitCode=0 Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717481 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717568 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerDied","Data":"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717608 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerDied","Data":"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerDied","Data":"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717639 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerDied","Data":"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"76e28dd2-ce66-4efc-b2fd-b3b221144aa6","Type":"ContainerDied","Data":"b034ec47c3ead02211681f6ab8a6b4e915d1aca4bb25c7a373e56b072220bf56"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.717670 4921 scope.go:117] "RemoveContainer" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.719694 4921 generic.go:334] "Generic (PLEG): container finished" podID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerID="6208e6e16328b07ee57acad0b7d54e2206e69379e81312961f6b0b4954b50748" exitCode=0 Mar 12 13:30:07 crc 
kubenswrapper[4921]: I0312 13:30:07.719726 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d56fdd9b-bhnqx" event={"ID":"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444","Type":"ContainerDied","Data":"6208e6e16328b07ee57acad0b7d54e2206e69379e81312961f6b0b4954b50748"} Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.751131 4921 scope.go:117] "RemoveContainer" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.772758 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.772830 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.781171 4921 scope.go:117] "RemoveContainer" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.792178 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.792640 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="proxy-httpd" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.792656 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="proxy-httpd" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.792671 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4c99a71-792e-4bc1-81d5-e75c67437787" containerName="oc" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.792679 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4c99a71-792e-4bc1-81d5-e75c67437787" containerName="oc" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.792716 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-central-agent" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.792725 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-central-agent" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.792739 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-notification-agent" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.792747 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-notification-agent" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.792756 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="sg-core" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.792763 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="sg-core" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.793010 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="proxy-httpd" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.793035 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-central-agent" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.793054 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4c99a71-792e-4bc1-81d5-e75c67437787" containerName="oc" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.793072 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="sg-core" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.793085 4921 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" containerName="ceilometer-notification-agent" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.795010 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.797562 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.797749 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.805461 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.812241 4921 scope.go:117] "RemoveContainer" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829013 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-scripts\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829359 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zws4\" (UniqueName: \"kubernetes.io/projected/8eddb82e-04d0-438b-9cb0-f66249bcd276-kube-api-access-2zws4\") pod \"ceilometer-0\" (UID: 
\"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829393 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-config-data\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829428 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-run-httpd\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829460 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-log-httpd\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.829486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.840451 4921 scope.go:117] "RemoveContainer" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.840898 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": 
container with ID starting with f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1 not found: ID does not exist" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.840932 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1"} err="failed to get container status \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": rpc error: code = NotFound desc = could not find container \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": container with ID starting with f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.840958 4921 scope.go:117] "RemoveContainer" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.841278 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": container with ID starting with 9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2 not found: ID does not exist" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.841324 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2"} err="failed to get container status \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": rpc error: code = NotFound desc = could not find container \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": container with ID starting with 
9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.841354 4921 scope.go:117] "RemoveContainer" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.841712 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": container with ID starting with 2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327 not found: ID does not exist" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.841756 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327"} err="failed to get container status \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": rpc error: code = NotFound desc = could not find container \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": container with ID starting with 2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.841788 4921 scope.go:117] "RemoveContainer" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" Mar 12 13:30:07 crc kubenswrapper[4921]: E0312 13:30:07.842127 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": container with ID starting with 0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7 not found: ID does not exist" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" Mar 12 13:30:07 crc 
kubenswrapper[4921]: I0312 13:30:07.842162 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7"} err="failed to get container status \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": rpc error: code = NotFound desc = could not find container \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": container with ID starting with 0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.842182 4921 scope.go:117] "RemoveContainer" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.842492 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1"} err="failed to get container status \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": rpc error: code = NotFound desc = could not find container \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": container with ID starting with f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.842512 4921 scope.go:117] "RemoveContainer" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.842740 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2"} err="failed to get container status \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": rpc error: code = NotFound desc = could not find container \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": container 
with ID starting with 9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.842760 4921 scope.go:117] "RemoveContainer" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843064 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327"} err="failed to get container status \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": rpc error: code = NotFound desc = could not find container \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": container with ID starting with 2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843082 4921 scope.go:117] "RemoveContainer" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843270 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7"} err="failed to get container status \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": rpc error: code = NotFound desc = could not find container \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": container with ID starting with 0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843287 4921 scope.go:117] "RemoveContainer" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843588 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1"} err="failed to get container status \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": rpc error: code = NotFound desc = could not find container \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": container with ID starting with f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843612 4921 scope.go:117] "RemoveContainer" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843961 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2"} err="failed to get container status \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": rpc error: code = NotFound desc = could not find container \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": container with ID starting with 9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.843985 4921 scope.go:117] "RemoveContainer" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.844249 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327"} err="failed to get container status \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": rpc error: code = NotFound desc = could not find container \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": container with ID starting with 2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327 not found: ID does not 
exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.844327 4921 scope.go:117] "RemoveContainer" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.844553 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7"} err="failed to get container status \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": rpc error: code = NotFound desc = could not find container \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": container with ID starting with 0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.844592 4921 scope.go:117] "RemoveContainer" containerID="f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.844940 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1"} err="failed to get container status \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": rpc error: code = NotFound desc = could not find container \"f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1\": container with ID starting with f05985d315439d45117a009e75703b0d5a2a2e6ca67f07dc25c98db1a07227d1 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.844958 4921 scope.go:117] "RemoveContainer" containerID="9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.845223 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2"} err="failed to get container status 
\"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": rpc error: code = NotFound desc = could not find container \"9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2\": container with ID starting with 9ac38ced983ad823537138cef18f66f89414e497d8563f175cba71a5504ec6c2 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.845249 4921 scope.go:117] "RemoveContainer" containerID="2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.845487 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327"} err="failed to get container status \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": rpc error: code = NotFound desc = could not find container \"2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327\": container with ID starting with 2d8c467ff6b58e9ba8ca613079891834d5627b408d4c1fd475caca4713976327 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.845505 4921 scope.go:117] "RemoveContainer" containerID="0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.845842 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7"} err="failed to get container status \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": rpc error: code = NotFound desc = could not find container \"0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7\": container with ID starting with 0e60ae113dea5ea93899d6c791aeb7ba53990818de24e1b5e1d25fc84d0ed7b7 not found: ID does not exist" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931512 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931594 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-scripts\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931653 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zws4\" (UniqueName: \"kubernetes.io/projected/8eddb82e-04d0-438b-9cb0-f66249bcd276-kube-api-access-2zws4\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931683 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-config-data\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931716 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-run-httpd\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931749 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-log-httpd\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 
13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.931771 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.933363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-run-httpd\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.936122 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-log-httpd\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.937676 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.938175 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.938785 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-scripts\") pod \"ceilometer-0\" (UID: 
\"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.939577 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-config-data\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.953738 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zws4\" (UniqueName: \"kubernetes.io/projected/8eddb82e-04d0-438b-9cb0-f66249bcd276-kube-api-access-2zws4\") pod \"ceilometer-0\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " pod="openstack/ceilometer-0" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.992908 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e28dd2-ce66-4efc-b2fd-b3b221144aa6" path="/var/lib/kubelet/pods/76e28dd2-ce66-4efc-b2fd-b3b221144aa6/volumes" Mar 12 13:30:07 crc kubenswrapper[4921]: I0312 13:30:07.993551 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2337cb-6456-4ad6-9be8-7da6c785025c" path="/var/lib/kubelet/pods/9d2337cb-6456-4ad6-9be8-7da6c785025c/volumes" Mar 12 13:30:08 crc kubenswrapper[4921]: I0312 13:30:08.113626 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:30:08 crc kubenswrapper[4921]: I0312 13:30:08.598354 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:08 crc kubenswrapper[4921]: W0312 13:30:08.612665 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eddb82e_04d0_438b_9cb0_f66249bcd276.slice/crio-70b84c02c8add7625843f1ce6bcc7bb51d49621e7fde67436865b85519a9bff6 WatchSource:0}: Error finding container 70b84c02c8add7625843f1ce6bcc7bb51d49621e7fde67436865b85519a9bff6: Status 404 returned error can't find the container with id 70b84c02c8add7625843f1ce6bcc7bb51d49621e7fde67436865b85519a9bff6 Mar 12 13:30:08 crc kubenswrapper[4921]: I0312 13:30:08.729899 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerStarted","Data":"70b84c02c8add7625843f1ce6bcc7bb51d49621e7fde67436865b85519a9bff6"} Mar 12 13:30:09 crc kubenswrapper[4921]: I0312 13:30:09.740369 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerStarted","Data":"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff"} Mar 12 13:30:10 crc kubenswrapper[4921]: I0312 13:30:10.751220 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerStarted","Data":"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0"} Mar 12 13:30:11 crc kubenswrapper[4921]: I0312 13:30:11.971604 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7ffb8f48-l6m2k" Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.306336 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7f7ffb8f48-l6m2k" 
Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.374134 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cff966cbd-8c6fq"] Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.374366 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cff966cbd-8c6fq" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-log" containerID="cri-o://902c256cb4282cba89b058f5c40357a799a90cac9c0a13138373c64e6b689cd2" gracePeriod=30 Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.374749 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cff966cbd-8c6fq" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-api" containerID="cri-o://da715f6ca5f367dbad5b9c201230eccc4275d1a915f195c25ae08bb7138d1cfb" gracePeriod=30 Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.768832 4921 generic.go:334] "Generic (PLEG): container finished" podID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerID="902c256cb4282cba89b058f5c40357a799a90cac9c0a13138373c64e6b689cd2" exitCode=143 Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.768850 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cff966cbd-8c6fq" event={"ID":"88666465-b61a-40e2-b20f-c8e6ad561ad8","Type":"ContainerDied","Data":"902c256cb4282cba89b058f5c40357a799a90cac9c0a13138373c64e6b689cd2"} Mar 12 13:30:12 crc kubenswrapper[4921]: I0312 13:30:12.909833 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:15 crc kubenswrapper[4921]: I0312 13:30:15.814097 4921 generic.go:334] "Generic (PLEG): container finished" podID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerID="da715f6ca5f367dbad5b9c201230eccc4275d1a915f195c25ae08bb7138d1cfb" exitCode=0 Mar 12 13:30:15 crc kubenswrapper[4921]: I0312 13:30:15.815228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-6cff966cbd-8c6fq" event={"ID":"88666465-b61a-40e2-b20f-c8e6ad561ad8","Type":"ContainerDied","Data":"da715f6ca5f367dbad5b9c201230eccc4275d1a915f195c25ae08bb7138d1cfb"} Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.450201 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586053 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-public-tls-certs\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586107 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-combined-ca-bundle\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88666465-b61a-40e2-b20f-c8e6ad561ad8-logs\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586180 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62fhm\" (UniqueName: \"kubernetes.io/projected/88666465-b61a-40e2-b20f-c8e6ad561ad8-kube-api-access-62fhm\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586210 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-internal-tls-certs\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586266 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-config-data\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.586324 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-scripts\") pod \"88666465-b61a-40e2-b20f-c8e6ad561ad8\" (UID: \"88666465-b61a-40e2-b20f-c8e6ad561ad8\") " Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.587492 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88666465-b61a-40e2-b20f-c8e6ad561ad8-logs" (OuterVolumeSpecName: "logs") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.592932 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88666465-b61a-40e2-b20f-c8e6ad561ad8-kube-api-access-62fhm" (OuterVolumeSpecName: "kube-api-access-62fhm") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "kube-api-access-62fhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.593688 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-scripts" (OuterVolumeSpecName: "scripts") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.636721 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.661985 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-config-data" (OuterVolumeSpecName: "config-data") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.674733 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.675619 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88666465-b61a-40e2-b20f-c8e6ad561ad8" (UID: "88666465-b61a-40e2-b20f-c8e6ad561ad8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688865 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688902 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688914 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88666465-b61a-40e2-b20f-c8e6ad561ad8-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688925 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62fhm\" (UniqueName: \"kubernetes.io/projected/88666465-b61a-40e2-b20f-c8e6ad561ad8-kube-api-access-62fhm\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688936 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688944 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.688953 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88666465-b61a-40e2-b20f-c8e6ad561ad8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.830000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerStarted","Data":"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912"} Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.831641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" event={"ID":"409470f7-5137-49c5-8d79-358a4466e1db","Type":"ContainerStarted","Data":"74516089e9d6d419de312e366d1fb36ab956835375f0c9c9f071d367d710f30a"} Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.835549 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cff966cbd-8c6fq" event={"ID":"88666465-b61a-40e2-b20f-c8e6ad561ad8","Type":"ContainerDied","Data":"fcf080e4ff2a6b1d6dfbda4cbc58b1e7d01dcb622a9d09c419f1d4e5a8d5a242"} Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.835589 4921 scope.go:117] "RemoveContainer" containerID="da715f6ca5f367dbad5b9c201230eccc4275d1a915f195c25ae08bb7138d1cfb" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.835712 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cff966cbd-8c6fq" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.883184 4921 scope.go:117] "RemoveContainer" containerID="902c256cb4282cba89b058f5c40357a799a90cac9c0a13138373c64e6b689cd2" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.892149 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" podStartSLOduration=1.67106195 podStartE2EDuration="10.892128834s" podCreationTimestamp="2026-03-12 13:30:06 +0000 UTC" firstStartedPulling="2026-03-12 13:30:06.984149276 +0000 UTC m=+1229.674221247" lastFinishedPulling="2026-03-12 13:30:16.20521616 +0000 UTC m=+1238.895288131" observedRunningTime="2026-03-12 13:30:16.873608964 +0000 UTC m=+1239.563680965" watchObservedRunningTime="2026-03-12 13:30:16.892128834 +0000 UTC m=+1239.582200805" Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.927538 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cff966cbd-8c6fq"] Mar 12 13:30:16 crc kubenswrapper[4921]: I0312 13:30:16.952334 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6cff966cbd-8c6fq"] Mar 12 13:30:17 crc kubenswrapper[4921]: I0312 13:30:17.848264 4921 generic.go:334] "Generic (PLEG): container finished" podID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerID="5de3ad8d3563a1c396e01a64104cc10969c7df05da22fb2e628de17147678961" exitCode=0 Mar 12 13:30:17 crc kubenswrapper[4921]: I0312 13:30:17.848350 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d56fdd9b-bhnqx" event={"ID":"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444","Type":"ContainerDied","Data":"5de3ad8d3563a1c396e01a64104cc10969c7df05da22fb2e628de17147678961"} Mar 12 13:30:17 crc kubenswrapper[4921]: I0312 13:30:17.848648 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-69d56fdd9b-bhnqx" 
event={"ID":"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444","Type":"ContainerDied","Data":"c86ec04c16847ce2fbe63232169f040cd0ee6d80498af3414c591b6e55fe6eff"} Mar 12 13:30:17 crc kubenswrapper[4921]: I0312 13:30:17.848664 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c86ec04c16847ce2fbe63232169f040cd0ee6d80498af3414c591b6e55fe6eff" Mar 12 13:30:17 crc kubenswrapper[4921]: I0312 13:30:17.906348 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.001429 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" path="/var/lib/kubelet/pods/88666465-b61a-40e2-b20f-c8e6ad561ad8/volumes" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.017039 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-httpd-config\") pod \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.017099 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf4nf\" (UniqueName: \"kubernetes.io/projected/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-kube-api-access-wf4nf\") pod \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.017150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-combined-ca-bundle\") pod \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.017202 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-ovndb-tls-certs\") pod \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.017386 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-config\") pod \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\" (UID: \"8bb8bed4-7b5d-4b51-82fc-2cb5ce749444\") " Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.023205 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-kube-api-access-wf4nf" (OuterVolumeSpecName: "kube-api-access-wf4nf") pod "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" (UID: "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444"). InnerVolumeSpecName "kube-api-access-wf4nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.026525 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" (UID: "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.066804 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" (UID: "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.080943 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-config" (OuterVolumeSpecName: "config") pod "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" (UID: "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.109182 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" (UID: "8bb8bed4-7b5d-4b51-82fc-2cb5ce749444"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.119351 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.119388 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf4nf\" (UniqueName: \"kubernetes.io/projected/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-kube-api-access-wf4nf\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.119400 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.119408 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 
13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.119416 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.857462 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-69d56fdd9b-bhnqx" Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.900530 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-69d56fdd9b-bhnqx"] Mar 12 13:30:18 crc kubenswrapper[4921]: I0312 13:30:18.910252 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-69d56fdd9b-bhnqx"] Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.872676 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerStarted","Data":"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199"} Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.873214 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-central-agent" containerID="cri-o://0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" gracePeriod=30 Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.873370 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.873922 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="proxy-httpd" containerID="cri-o://1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" gracePeriod=30 Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.874053 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-notification-agent" containerID="cri-o://85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" gracePeriod=30 Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.874212 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="sg-core" containerID="cri-o://b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" gracePeriod=30 Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.918933 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.439476913 podStartE2EDuration="12.918896933s" podCreationTimestamp="2026-03-12 13:30:07 +0000 UTC" firstStartedPulling="2026-03-12 13:30:08.61627076 +0000 UTC m=+1231.306342731" lastFinishedPulling="2026-03-12 13:30:19.09569078 +0000 UTC m=+1241.785762751" observedRunningTime="2026-03-12 13:30:19.911857891 +0000 UTC m=+1242.601929962" watchObservedRunningTime="2026-03-12 13:30:19.918896933 +0000 UTC m=+1242.608968954" Mar 12 13:30:19 crc kubenswrapper[4921]: I0312 13:30:19.997955 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" path="/var/lib/kubelet/pods/8bb8bed4-7b5d-4b51-82fc-2cb5ce749444/volumes" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.816686 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893102 4921 generic.go:334] "Generic (PLEG): container finished" podID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" exitCode=0 Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893140 4921 generic.go:334] "Generic (PLEG): container finished" podID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" exitCode=2 Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893150 4921 generic.go:334] "Generic (PLEG): container finished" podID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" exitCode=0 Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893161 4921 generic.go:334] "Generic (PLEG): container finished" podID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" exitCode=0 Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893186 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerDied","Data":"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199"} Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893222 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerDied","Data":"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912"} Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893249 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerDied","Data":"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0"} Mar 12 13:30:20 crc 
kubenswrapper[4921]: I0312 13:30:20.893267 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerDied","Data":"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff"} Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893280 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8eddb82e-04d0-438b-9cb0-f66249bcd276","Type":"ContainerDied","Data":"70b84c02c8add7625843f1ce6bcc7bb51d49621e7fde67436865b85519a9bff6"} Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893304 4921 scope.go:117] "RemoveContainer" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.893465 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.913518 4921 scope.go:117] "RemoveContainer" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.929273 4921 scope.go:117] "RemoveContainer" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.946047 4921 scope.go:117] "RemoveContainer" containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.965198 4921 scope.go:117] "RemoveContainer" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" Mar 12 13:30:20 crc kubenswrapper[4921]: E0312 13:30:20.965569 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": container with ID starting with 1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199 
not found: ID does not exist" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.965621 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199"} err="failed to get container status \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": rpc error: code = NotFound desc = could not find container \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": container with ID starting with 1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.965650 4921 scope.go:117] "RemoveContainer" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" Mar 12 13:30:20 crc kubenswrapper[4921]: E0312 13:30:20.965890 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": container with ID starting with b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912 not found: ID does not exist" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.965923 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912"} err="failed to get container status \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": rpc error: code = NotFound desc = could not find container \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": container with ID starting with b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 
13:30:20.965942 4921 scope.go:117] "RemoveContainer" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" Mar 12 13:30:20 crc kubenswrapper[4921]: E0312 13:30:20.966291 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": container with ID starting with 85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0 not found: ID does not exist" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.966324 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0"} err="failed to get container status \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": rpc error: code = NotFound desc = could not find container \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": container with ID starting with 85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.966347 4921 scope.go:117] "RemoveContainer" containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" Mar 12 13:30:20 crc kubenswrapper[4921]: E0312 13:30:20.966725 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": container with ID starting with 0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff not found: ID does not exist" containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.966782 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff"} err="failed to get container status \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": rpc error: code = NotFound desc = could not find container \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": container with ID starting with 0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.966836 4921 scope.go:117] "RemoveContainer" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.967333 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199"} err="failed to get container status \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": rpc error: code = NotFound desc = could not find container \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": container with ID starting with 1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.967593 4921 scope.go:117] "RemoveContainer" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.968069 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912"} err="failed to get container status \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": rpc error: code = NotFound desc = could not find container \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": container with ID starting with b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912 not found: ID does not 
exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.968096 4921 scope.go:117] "RemoveContainer" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.968836 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0"} err="failed to get container status \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": rpc error: code = NotFound desc = could not find container \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": container with ID starting with 85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.968879 4921 scope.go:117] "RemoveContainer" containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.969115 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff"} err="failed to get container status \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": rpc error: code = NotFound desc = could not find container \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": container with ID starting with 0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.969134 4921 scope.go:117] "RemoveContainer" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.969577 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199"} err="failed to get container status 
\"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": rpc error: code = NotFound desc = could not find container \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": container with ID starting with 1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.969609 4921 scope.go:117] "RemoveContainer" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.969883 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912"} err="failed to get container status \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": rpc error: code = NotFound desc = could not find container \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": container with ID starting with b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.969902 4921 scope.go:117] "RemoveContainer" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.970153 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0"} err="failed to get container status \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": rpc error: code = NotFound desc = could not find container \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": container with ID starting with 85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.970170 4921 scope.go:117] "RemoveContainer" 
containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.970557 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff"} err="failed to get container status \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": rpc error: code = NotFound desc = could not find container \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": container with ID starting with 0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.970587 4921 scope.go:117] "RemoveContainer" containerID="1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.970892 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199"} err="failed to get container status \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": rpc error: code = NotFound desc = could not find container \"1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199\": container with ID starting with 1143562c228681390ca95fb0409111cfe90744dc2ebf16312daa00616ecb7199 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.970915 4921 scope.go:117] "RemoveContainer" containerID="b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.971113 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912"} err="failed to get container status \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": rpc error: code = NotFound desc = could 
not find container \"b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912\": container with ID starting with b6b69e2a3aa0eb37740a298d598510ed015954e9dd5cc8cc947d68792a535912 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.971133 4921 scope.go:117] "RemoveContainer" containerID="85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.971296 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0"} err="failed to get container status \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": rpc error: code = NotFound desc = could not find container \"85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0\": container with ID starting with 85660ca48a58cdb52cd2f067a1c5b27d4a506542cbae4904de7f19ae3e15d2d0 not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.971308 4921 scope.go:117] "RemoveContainer" containerID="0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.971449 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff"} err="failed to get container status \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": rpc error: code = NotFound desc = could not find container \"0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff\": container with ID starting with 0b6f0a637b3f599f3bf485fd06835e67b5bf3c5359489806fc346be6b57679ff not found: ID does not exist" Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.972698 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-config-data\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.972899 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-run-httpd\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.973103 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-log-httpd\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.973400 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zws4\" (UniqueName: \"kubernetes.io/projected/8eddb82e-04d0-438b-9cb0-f66249bcd276-kube-api-access-2zws4\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.973669 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-combined-ca-bundle\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") " Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.973408 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.973623 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.974191 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-sg-core-conf-yaml\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") "
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.974369 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-scripts\") pod \"8eddb82e-04d0-438b-9cb0-f66249bcd276\" (UID: \"8eddb82e-04d0-438b-9cb0-f66249bcd276\") "
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.975629 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.976078 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8eddb82e-04d0-438b-9cb0-f66249bcd276-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.979937 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-scripts" (OuterVolumeSpecName: "scripts") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:20 crc kubenswrapper[4921]: I0312 13:30:20.997636 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eddb82e-04d0-438b-9cb0-f66249bcd276-kube-api-access-2zws4" (OuterVolumeSpecName: "kube-api-access-2zws4") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "kube-api-access-2zws4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.004969 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.046695 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.064290 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-config-data" (OuterVolumeSpecName: "config-data") pod "8eddb82e-04d0-438b-9cb0-f66249bcd276" (UID: "8eddb82e-04d0-438b-9cb0-f66249bcd276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.078186 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zws4\" (UniqueName: \"kubernetes.io/projected/8eddb82e-04d0-438b-9cb0-f66249bcd276-kube-api-access-2zws4\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.078220 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.078233 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.078243 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.078253 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8eddb82e-04d0-438b-9cb0-f66249bcd276-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.235464 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.244753 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.277415 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.277859 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-notification-agent"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.277886 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-notification-agent"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.277901 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="sg-core"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.277909 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="sg-core"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.277922 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-api"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.277949 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-api"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.277966 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-api"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.277974 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-api"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.278003 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-central-agent"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278011 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-central-agent"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.278024 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="proxy-httpd"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278031 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="proxy-httpd"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.278042 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-log"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278050 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-log"
Mar 12 13:30:21 crc kubenswrapper[4921]: E0312 13:30:21.278064 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-httpd"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278073 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-httpd"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278266 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="sg-core"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278286 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-notification-agent"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278299 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="proxy-httpd"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278314 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-api"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278327 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb8bed4-7b5d-4b51-82fc-2cb5ce749444" containerName="neutron-httpd"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278335 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-log"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278345 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="88666465-b61a-40e2-b20f-c8e6ad561ad8" containerName="placement-api"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.278353 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" containerName="ceilometer-central-agent"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.280220 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.284457 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.284739 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.303202 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.382771 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.382848 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-scripts\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.382967 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-run-httpd\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.382981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-config-data\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.382999 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-log-httpd\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.383018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.383035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxqqf\" (UniqueName: \"kubernetes.io/projected/24ae94ce-75ed-4179-b457-22252ef9664b-kube-api-access-kxqqf\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.484169 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-run-httpd\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.484215 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-config-data\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.484257 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-log-httpd\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.484277 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxqqf\" (UniqueName: \"kubernetes.io/projected/24ae94ce-75ed-4179-b457-22252ef9664b-kube-api-access-kxqqf\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.485061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-run-httpd\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.485100 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-log-httpd\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.485113 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.485396 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.485514 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-scripts\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.489185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.490065 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-scripts\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.490741 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-config-data\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.490941 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.520177 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxqqf\" (UniqueName: \"kubernetes.io/projected/24ae94ce-75ed-4179-b457-22252ef9664b-kube-api-access-kxqqf\") pod \"ceilometer-0\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") " pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.656380 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:30:21 crc kubenswrapper[4921]: I0312 13:30:21.994329 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eddb82e-04d0-438b-9cb0-f66249bcd276" path="/var/lib/kubelet/pods/8eddb82e-04d0-438b-9cb0-f66249bcd276/volumes"
Mar 12 13:30:22 crc kubenswrapper[4921]: W0312 13:30:22.146122 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24ae94ce_75ed_4179_b457_22252ef9664b.slice/crio-79cfdbd8708d65f421b12fff7e08b877335552b69559f630ddb2acb0130ad268 WatchSource:0}: Error finding container 79cfdbd8708d65f421b12fff7e08b877335552b69559f630ddb2acb0130ad268: Status 404 returned error can't find the container with id 79cfdbd8708d65f421b12fff7e08b877335552b69559f630ddb2acb0130ad268
Mar 12 13:30:22 crc kubenswrapper[4921]: I0312 13:30:22.151551 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:30:22 crc kubenswrapper[4921]: I0312 13:30:22.918248 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerStarted","Data":"79cfdbd8708d65f421b12fff7e08b877335552b69559f630ddb2acb0130ad268"}
Mar 12 13:30:23 crc kubenswrapper[4921]: I0312 13:30:23.928270 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerStarted","Data":"5067117721fc092451c4f9b01c750a89c89680c8845f066ffd39b954a70bff68"}
Mar 12 13:30:23 crc kubenswrapper[4921]: I0312 13:30:23.928606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerStarted","Data":"b33ba9207b0baccb57a49d2265b852fe1696b657bc0dda39bc9d05884905bb61"}
Mar 12 13:30:24 crc kubenswrapper[4921]: I0312 13:30:24.939663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerStarted","Data":"c93d7f4e6563a5458f4745865be6dc06aceca7828b20f9e5878ac34b437a485e"}
Mar 12 13:30:25 crc kubenswrapper[4921]: I0312 13:30:25.947327 4921 generic.go:334] "Generic (PLEG): container finished" podID="409470f7-5137-49c5-8d79-358a4466e1db" containerID="74516089e9d6d419de312e366d1fb36ab956835375f0c9c9f071d367d710f30a" exitCode=0
Mar 12 13:30:25 crc kubenswrapper[4921]: I0312 13:30:25.947393 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" event={"ID":"409470f7-5137-49c5-8d79-358a4466e1db","Type":"ContainerDied","Data":"74516089e9d6d419de312e366d1fb36ab956835375f0c9c9f071d367d710f30a"}
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.268058 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9g6d"
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.399297 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znzgs\" (UniqueName: \"kubernetes.io/projected/409470f7-5137-49c5-8d79-358a4466e1db-kube-api-access-znzgs\") pod \"409470f7-5137-49c5-8d79-358a4466e1db\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") "
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.399641 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-scripts\") pod \"409470f7-5137-49c5-8d79-358a4466e1db\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") "
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.400127 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-config-data\") pod \"409470f7-5137-49c5-8d79-358a4466e1db\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") "
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.400169 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-combined-ca-bundle\") pod \"409470f7-5137-49c5-8d79-358a4466e1db\" (UID: \"409470f7-5137-49c5-8d79-358a4466e1db\") "
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.403985 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-scripts" (OuterVolumeSpecName: "scripts") pod "409470f7-5137-49c5-8d79-358a4466e1db" (UID: "409470f7-5137-49c5-8d79-358a4466e1db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.404682 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/409470f7-5137-49c5-8d79-358a4466e1db-kube-api-access-znzgs" (OuterVolumeSpecName: "kube-api-access-znzgs") pod "409470f7-5137-49c5-8d79-358a4466e1db" (UID: "409470f7-5137-49c5-8d79-358a4466e1db"). InnerVolumeSpecName "kube-api-access-znzgs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.426600 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "409470f7-5137-49c5-8d79-358a4466e1db" (UID: "409470f7-5137-49c5-8d79-358a4466e1db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.427698 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-config-data" (OuterVolumeSpecName: "config-data") pod "409470f7-5137-49c5-8d79-358a4466e1db" (UID: "409470f7-5137-49c5-8d79-358a4466e1db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.501841 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znzgs\" (UniqueName: \"kubernetes.io/projected/409470f7-5137-49c5-8d79-358a4466e1db-kube-api-access-znzgs\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.501874 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.501884 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.501896 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/409470f7-5137-49c5-8d79-358a4466e1db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.976937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-t9g6d" event={"ID":"409470f7-5137-49c5-8d79-358a4466e1db","Type":"ContainerDied","Data":"d5648e8d7429e7905b8234a23256e1a5c2272db0152ab7de628e58591e6f95ca"}
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.976995 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5648e8d7429e7905b8234a23256e1a5c2272db0152ab7de628e58591e6f95ca"
Mar 12 13:30:27 crc kubenswrapper[4921]: I0312 13:30:27.977048 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-t9g6d"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.009933 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.009979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerStarted","Data":"60ad4625461cb042e7fca2f54c152e8678c5cf207605d18a176c244f6f5ec03a"}
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.068450 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 13:30:28 crc kubenswrapper[4921]: E0312 13:30:28.069522 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="409470f7-5137-49c5-8d79-358a4466e1db" containerName="nova-cell0-conductor-db-sync"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.069585 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="409470f7-5137-49c5-8d79-358a4466e1db" containerName="nova-cell0-conductor-db-sync"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.069942 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="409470f7-5137-49c5-8d79-358a4466e1db" containerName="nova-cell0-conductor-db-sync"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.070921 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.074333 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wwpsw"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.074606 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.075539 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.421577699 podStartE2EDuration="7.075512824s" podCreationTimestamp="2026-03-12 13:30:21 +0000 UTC" firstStartedPulling="2026-03-12 13:30:22.148466307 +0000 UTC m=+1244.838538278" lastFinishedPulling="2026-03-12 13:30:26.802401382 +0000 UTC m=+1249.492473403" observedRunningTime="2026-03-12 13:30:28.057713855 +0000 UTC m=+1250.747785866" watchObservedRunningTime="2026-03-12 13:30:28.075512824 +0000 UTC m=+1250.765584815"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.088389 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.137057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072b6f7c-f4af-4657-82e6-ff8acb7404d5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.137414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xclfk\" (UniqueName: \"kubernetes.io/projected/072b6f7c-f4af-4657-82e6-ff8acb7404d5-kube-api-access-xclfk\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.137513 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072b6f7c-f4af-4657-82e6-ff8acb7404d5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.239197 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xclfk\" (UniqueName: \"kubernetes.io/projected/072b6f7c-f4af-4657-82e6-ff8acb7404d5-kube-api-access-xclfk\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.239242 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072b6f7c-f4af-4657-82e6-ff8acb7404d5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.239300 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072b6f7c-f4af-4657-82e6-ff8acb7404d5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.255200 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072b6f7c-f4af-4657-82e6-ff8acb7404d5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.264914 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072b6f7c-f4af-4657-82e6-ff8acb7404d5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.266130 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xclfk\" (UniqueName: \"kubernetes.io/projected/072b6f7c-f4af-4657-82e6-ff8acb7404d5-kube-api-access-xclfk\") pod \"nova-cell0-conductor-0\" (UID: \"072b6f7c-f4af-4657-82e6-ff8acb7404d5\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.387732 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.830735 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 13:30:28 crc kubenswrapper[4921]: W0312 13:30:28.835526 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod072b6f7c_f4af_4657_82e6_ff8acb7404d5.slice/crio-d40b2ffe3509e65651e0053f079151f9ad6b08f277ecf2df392132f360c3b781 WatchSource:0}: Error finding container d40b2ffe3509e65651e0053f079151f9ad6b08f277ecf2df392132f360c3b781: Status 404 returned error can't find the container with id d40b2ffe3509e65651e0053f079151f9ad6b08f277ecf2df392132f360c3b781
Mar 12 13:30:28 crc kubenswrapper[4921]: I0312 13:30:28.996785 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"072b6f7c-f4af-4657-82e6-ff8acb7404d5","Type":"ContainerStarted","Data":"d40b2ffe3509e65651e0053f079151f9ad6b08f277ecf2df392132f360c3b781"}
Mar 12 13:30:30 crc kubenswrapper[4921]: I0312 13:30:30.004200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"072b6f7c-f4af-4657-82e6-ff8acb7404d5","Type":"ContainerStarted","Data":"bcd93f23e812837c446f08a6ae149f2c3b7e2ede13b88272a28274cdc6268f86"}
Mar 12 13:30:30 crc kubenswrapper[4921]: I0312 13:30:30.004414 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:30 crc kubenswrapper[4921]: I0312 13:30:30.030632 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.03060996 podStartE2EDuration="2.03060996s" podCreationTimestamp="2026-03-12 13:30:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:30.024533967 +0000 UTC m=+1252.714606028" watchObservedRunningTime="2026-03-12 13:30:30.03060996 +0000 UTC m=+1252.720681941"
Mar 12 13:30:38 crc kubenswrapper[4921]: I0312 13:30:38.445336 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 12 13:30:38 crc kubenswrapper[4921]: I0312 13:30:38.983399 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pm2mb"]
Mar 12 13:30:38 crc kubenswrapper[4921]: I0312 13:30:38.985069 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:38 crc kubenswrapper[4921]: I0312 13:30:38.989088 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 12 13:30:38 crc kubenswrapper[4921]: I0312 13:30:38.990013 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.029909 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pm2mb"]
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.074549 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g597t\" (UniqueName: \"kubernetes.io/projected/6f893a7c-5319-4a48-b19e-7405f7d64887-kube-api-access-g597t\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.074857 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-scripts\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.074942 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.075030 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-config-data\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.176683 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-scripts\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.176765 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.176797 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-config-data\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.176850 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g597t\" (UniqueName: \"kubernetes.io/projected/6f893a7c-5319-4a48-b19e-7405f7d64887-kube-api-access-g597t\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.186877 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.186941 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-config-data\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.190237 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-scripts\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.194993 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g597t\" (UniqueName: \"kubernetes.io/projected/6f893a7c-5319-4a48-b19e-7405f7d64887-kube-api-access-g597t\") pod \"nova-cell0-cell-mapping-pm2mb\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " pod="openstack/nova-cell0-cell-mapping-pm2mb"
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.254400 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.256160 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.261185 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.278952 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.280569 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: W0312 13:30:39.286350 4921 reflector.go:561] object-"openstack"/"nova-metadata-config-data": failed to list *v1.Secret: secrets "nova-metadata-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Mar 12 13:30:39 crc kubenswrapper[4921]: E0312 13:30:39.286409 4921 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"nova-metadata-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nova-metadata-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.287373 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.298652 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.314455 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pm2mb" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.382777 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97h5l\" (UniqueName: \"kubernetes.io/projected/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-kube-api-access-97h5l\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.382839 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-config-data\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.382856 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-logs\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.382956 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.382971 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.382986 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-config-data\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.383006 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwg6l\" (UniqueName: \"kubernetes.io/projected/217d08e5-4eb0-4ba3-8c09-2249efa14d22-kube-api-access-xwg6l\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.383028 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/217d08e5-4eb0-4ba3-8c09-2249efa14d22-logs\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.398275 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.416039 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.435151 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.476174 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.491095 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-46l8p"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.494318 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.489069 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.503386 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.503568 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-config-data\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.503878 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-config-data\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.504195 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwg6l\" (UniqueName: \"kubernetes.io/projected/217d08e5-4eb0-4ba3-8c09-2249efa14d22-kube-api-access-xwg6l\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.505034 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.540398 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-46l8p"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541322 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/217d08e5-4eb0-4ba3-8c09-2249efa14d22-logs\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541378 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97h5l\" (UniqueName: \"kubernetes.io/projected/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-kube-api-access-97h5l\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541410 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-config-data\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541430 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-logs\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541509 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzvfg\" (UniqueName: \"kubernetes.io/projected/bd9bc3b0-f90b-44d7-97af-c960fac4d831-kube-api-access-qzvfg\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.541941 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/217d08e5-4eb0-4ba3-8c09-2249efa14d22-logs\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.542211 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-logs\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.543317 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.546628 4921 scope.go:117] "RemoveContainer" containerID="6dade520635acbe8d38599c5edab3eb49ce6c98a0fe8b1c22720cbc70fc59f28" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.547174 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-config-data\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.561571 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwg6l\" (UniqueName: \"kubernetes.io/projected/217d08e5-4eb0-4ba3-8c09-2249efa14d22-kube-api-access-xwg6l\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.564837 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97h5l\" (UniqueName: \"kubernetes.io/projected/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-kube-api-access-97h5l\") pod \"nova-api-0\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.577257 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.582983 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.585242 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.589241 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.606580 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.660879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-config-data\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661279 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661307 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckcsk\" (UniqueName: \"kubernetes.io/projected/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-kube-api-access-ckcsk\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661333 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661377 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " 
pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661425 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661440 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvt6n\" (UniqueName: \"kubernetes.io/projected/7b0fe145-4f1b-4673-8876-6fa26c82d046-kube-api-access-bvt6n\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661469 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzvfg\" (UniqueName: \"kubernetes.io/projected/bd9bc3b0-f90b-44d7-97af-c960fac4d831-kube-api-access-qzvfg\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661493 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661525 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-dns-svc\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " 
pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.661570 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-config\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.667575 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.680369 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-config-data\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.686845 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzvfg\" (UniqueName: \"kubernetes.io/projected/bd9bc3b0-f90b-44d7-97af-c960fac4d831-kube-api-access-qzvfg\") pod \"nova-scheduler-0\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.763790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.763861 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bvt6n\" (UniqueName: \"kubernetes.io/projected/7b0fe145-4f1b-4673-8876-6fa26c82d046-kube-api-access-bvt6n\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.763901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.763934 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-dns-svc\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.763979 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-config\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.764018 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.764036 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckcsk\" 
(UniqueName: \"kubernetes.io/projected/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-kube-api-access-ckcsk\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.764083 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.764942 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.765898 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-dns-svc\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.766778 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.770731 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-config\") pod 
\"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.773715 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.775316 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.789072 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvt6n\" (UniqueName: \"kubernetes.io/projected/7b0fe145-4f1b-4673-8876-6fa26c82d046-kube-api-access-bvt6n\") pod \"nova-cell1-novncproxy-0\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.790522 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckcsk\" (UniqueName: \"kubernetes.io/projected/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-kube-api-access-ckcsk\") pod \"dnsmasq-dns-566b5b7845-46l8p\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.790717 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.867766 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:39 crc kubenswrapper[4921]: I0312 13:30:39.912611 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.054270 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pm2mb"] Mar 12 13:30:40 crc kubenswrapper[4921]: W0312 13:30:40.063433 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f893a7c_5319_4a48_b19e_7405f7d64887.slice/crio-9570bc7e08ad97780d4a4dbb5da3ad9770558b76953eeb8e56846e971461f010 WatchSource:0}: Error finding container 9570bc7e08ad97780d4a4dbb5da3ad9770558b76953eeb8e56846e971461f010: Status 404 returned error can't find the container with id 9570bc7e08ad97780d4a4dbb5da3ad9770558b76953eeb8e56846e971461f010 Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.111066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pm2mb" event={"ID":"6f893a7c-5319-4a48-b19e-7405f7d64887","Type":"ContainerStarted","Data":"9570bc7e08ad97780d4a4dbb5da3ad9770558b76953eeb8e56846e971461f010"} Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.127697 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lqd7h"] Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.129373 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.132074 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.132347 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.137283 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lqd7h"] Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.223379 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.283809 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nk96\" (UniqueName: \"kubernetes.io/projected/b8b648cc-0abb-4d1a-8287-9041c336c678-kube-api-access-2nk96\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.283946 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-scripts\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.283980 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-config-data\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 
13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.284011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.380077 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.386174 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nk96\" (UniqueName: \"kubernetes.io/projected/b8b648cc-0abb-4d1a-8287-9041c336c678-kube-api-access-2nk96\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.386268 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-scripts\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.386312 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-config-data\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.386338 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.408634 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-config-data\") pod \"nova-metadata-0\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") " pod="openstack/nova-metadata-0" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.424451 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-scripts\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.433390 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nk96\" (UniqueName: \"kubernetes.io/projected/b8b648cc-0abb-4d1a-8287-9041c336c678-kube-api-access-2nk96\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.446699 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-config-data\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.465731 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.472665 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-lqd7h\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.515580 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-46l8p"] Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.526976 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.535842 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:30:40 crc kubenswrapper[4921]: I0312 13:30:40.764152 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.110602 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.122977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd9bc3b0-f90b-44d7-97af-c960fac4d831","Type":"ContainerStarted","Data":"d6ff25260633fe282db74e0404e8c3ddbc9ebc223d782f92a7c811f349d59e57"} Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.132125 4921 generic.go:334] "Generic (PLEG): container finished" podID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerID="68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1" exitCode=0 Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.132202 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" 
event={"ID":"cc91c058-9ddc-41e2-b22d-0c83a87afbd7","Type":"ContainerDied","Data":"68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1"} Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.132307 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" event={"ID":"cc91c058-9ddc-41e2-b22d-0c83a87afbd7","Type":"ContainerStarted","Data":"28812fb312b4ab863d3d1011c6826ca6e4d8020c603880690f43120927c149a8"} Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.143126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f783b72a-b838-4000-8d46-fa9fd1a7f8e1","Type":"ContainerStarted","Data":"95d48a7052b9d6ba2e28ccc01d1b70df9ce813708b65da33fb5f55a42d026d8e"} Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.146644 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pm2mb" event={"ID":"6f893a7c-5319-4a48-b19e-7405f7d64887","Type":"ContainerStarted","Data":"6aa235037faf86183a1d5c20a523b0adf18be64c6990fdca817c654d763fe1ae"} Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.149935 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7b0fe145-4f1b-4673-8876-6fa26c82d046","Type":"ContainerStarted","Data":"283d05d61aa2293a6acb2efdc540cb35c152b432dbb0745ad144e3b6bf2783f5"} Mar 12 13:30:41 crc kubenswrapper[4921]: I0312 13:30:41.188471 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pm2mb" podStartSLOduration=3.188451229 podStartE2EDuration="3.188451229s" podCreationTimestamp="2026-03-12 13:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:41.180806018 +0000 UTC m=+1263.870877989" watchObservedRunningTime="2026-03-12 13:30:41.188451229 +0000 UTC m=+1263.878523200" Mar 12 13:30:41 crc kubenswrapper[4921]: 
I0312 13:30:41.261153 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lqd7h"] Mar 12 13:30:41 crc kubenswrapper[4921]: W0312 13:30:41.270492 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8b648cc_0abb_4d1a_8287_9041c336c678.slice/crio-bd3d7d136350a03df30eacd019dbe5533d17e58144a85d6f3dbb417070e6303f WatchSource:0}: Error finding container bd3d7d136350a03df30eacd019dbe5533d17e58144a85d6f3dbb417070e6303f: Status 404 returned error can't find the container with id bd3d7d136350a03df30eacd019dbe5533d17e58144a85d6f3dbb417070e6303f Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.165596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" event={"ID":"b8b648cc-0abb-4d1a-8287-9041c336c678","Type":"ContainerStarted","Data":"cec077e080f0d09301e16fc0bedc02ab7146565f449d5c63057e60869a1d8412"} Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.166038 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" event={"ID":"b8b648cc-0abb-4d1a-8287-9041c336c678","Type":"ContainerStarted","Data":"bd3d7d136350a03df30eacd019dbe5533d17e58144a85d6f3dbb417070e6303f"} Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.169055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" event={"ID":"cc91c058-9ddc-41e2-b22d-0c83a87afbd7","Type":"ContainerStarted","Data":"f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a"} Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.169124 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.170875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"217d08e5-4eb0-4ba3-8c09-2249efa14d22","Type":"ContainerStarted","Data":"8bf322cf22cc787a35393302923184ec61b51eb83047e31ff13c9cdb5637d346"} Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.188362 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" podStartSLOduration=2.188347183 podStartE2EDuration="2.188347183s" podCreationTimestamp="2026-03-12 13:30:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:42.188159697 +0000 UTC m=+1264.878231668" watchObservedRunningTime="2026-03-12 13:30:42.188347183 +0000 UTC m=+1264.878419144" Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.210109 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" podStartSLOduration=3.21009189 podStartE2EDuration="3.21009189s" podCreationTimestamp="2026-03-12 13:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:42.203722747 +0000 UTC m=+1264.893794718" watchObservedRunningTime="2026-03-12 13:30:42.21009189 +0000 UTC m=+1264.900163851" Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.783222 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:30:42 crc kubenswrapper[4921]: I0312 13:30:42.799515 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.190453 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"217d08e5-4eb0-4ba3-8c09-2249efa14d22","Type":"ContainerStarted","Data":"8a9ab96a81d8a70955c31c19fd61b688c079bef8c3e44a71f91176aa5f39f2e3"} Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.191185 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"217d08e5-4eb0-4ba3-8c09-2249efa14d22","Type":"ContainerStarted","Data":"042a7c6eb55877775c0c17124d7a30aea19906eca4cdd64ed922b72bf671d89a"} Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.190731 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-log" containerID="cri-o://042a7c6eb55877775c0c17124d7a30aea19906eca4cdd64ed922b72bf671d89a" gracePeriod=30 Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.191230 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-metadata" containerID="cri-o://8a9ab96a81d8a70955c31c19fd61b688c079bef8c3e44a71f91176aa5f39f2e3" gracePeriod=30 Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.195093 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f783b72a-b838-4000-8d46-fa9fd1a7f8e1","Type":"ContainerStarted","Data":"8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f"} Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.195172 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f783b72a-b838-4000-8d46-fa9fd1a7f8e1","Type":"ContainerStarted","Data":"074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6"} Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.197657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7b0fe145-4f1b-4673-8876-6fa26c82d046","Type":"ContainerStarted","Data":"0f56e514eb4f3b0c3c628b6e958dd0dec3f9253bed6d2fd6846ea94f9a6d4894"} Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.197918 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-novncproxy-0" podUID="7b0fe145-4f1b-4673-8876-6fa26c82d046" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0f56e514eb4f3b0c3c628b6e958dd0dec3f9253bed6d2fd6846ea94f9a6d4894" gracePeriod=30 Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.201277 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd9bc3b0-f90b-44d7-97af-c960fac4d831","Type":"ContainerStarted","Data":"ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee"} Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.215842 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.849868641 podStartE2EDuration="5.215799466s" podCreationTimestamp="2026-03-12 13:30:39 +0000 UTC" firstStartedPulling="2026-03-12 13:30:41.149371678 +0000 UTC m=+1263.839443649" lastFinishedPulling="2026-03-12 13:30:43.515302503 +0000 UTC m=+1266.205374474" observedRunningTime="2026-03-12 13:30:44.214314501 +0000 UTC m=+1266.904386502" watchObservedRunningTime="2026-03-12 13:30:44.215799466 +0000 UTC m=+1266.905871437" Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.239351 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.189226463 podStartE2EDuration="5.239327087s" podCreationTimestamp="2026-03-12 13:30:39 +0000 UTC" firstStartedPulling="2026-03-12 13:30:40.463924339 +0000 UTC m=+1263.153996310" lastFinishedPulling="2026-03-12 13:30:43.514024963 +0000 UTC m=+1266.204096934" observedRunningTime="2026-03-12 13:30:44.229316815 +0000 UTC m=+1266.919388786" watchObservedRunningTime="2026-03-12 13:30:44.239327087 +0000 UTC m=+1266.929399058" Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.262944 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.971636213 podStartE2EDuration="5.26292092s" 
podCreationTimestamp="2026-03-12 13:30:39 +0000 UTC" firstStartedPulling="2026-03-12 13:30:40.223955684 +0000 UTC m=+1262.914027655" lastFinishedPulling="2026-03-12 13:30:43.515240381 +0000 UTC m=+1266.205312362" observedRunningTime="2026-03-12 13:30:44.24869884 +0000 UTC m=+1266.938770821" watchObservedRunningTime="2026-03-12 13:30:44.26292092 +0000 UTC m=+1266.952992891" Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.274789 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.274252322 podStartE2EDuration="5.274768938s" podCreationTimestamp="2026-03-12 13:30:39 +0000 UTC" firstStartedPulling="2026-03-12 13:30:40.520012194 +0000 UTC m=+1263.210084165" lastFinishedPulling="2026-03-12 13:30:43.52052881 +0000 UTC m=+1266.210600781" observedRunningTime="2026-03-12 13:30:44.270066326 +0000 UTC m=+1266.960138297" watchObservedRunningTime="2026-03-12 13:30:44.274768938 +0000 UTC m=+1266.964840909" Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.791618 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 13:30:44 crc kubenswrapper[4921]: I0312 13:30:44.913065 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:30:45 crc kubenswrapper[4921]: I0312 13:30:45.211740 4921 generic.go:334] "Generic (PLEG): container finished" podID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerID="042a7c6eb55877775c0c17124d7a30aea19906eca4cdd64ed922b72bf671d89a" exitCode=143 Mar 12 13:30:45 crc kubenswrapper[4921]: I0312 13:30:45.212047 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"217d08e5-4eb0-4ba3-8c09-2249efa14d22","Type":"ContainerDied","Data":"042a7c6eb55877775c0c17124d7a30aea19906eca4cdd64ed922b72bf671d89a"} Mar 12 13:30:48 crc kubenswrapper[4921]: I0312 13:30:48.242962 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="6f893a7c-5319-4a48-b19e-7405f7d64887" containerID="6aa235037faf86183a1d5c20a523b0adf18be64c6990fdca817c654d763fe1ae" exitCode=0 Mar 12 13:30:48 crc kubenswrapper[4921]: I0312 13:30:48.243066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pm2mb" event={"ID":"6f893a7c-5319-4a48-b19e-7405f7d64887","Type":"ContainerDied","Data":"6aa235037faf86183a1d5c20a523b0adf18be64c6990fdca817c654d763fe1ae"} Mar 12 13:30:48 crc kubenswrapper[4921]: I0312 13:30:48.245760 4921 generic.go:334] "Generic (PLEG): container finished" podID="b8b648cc-0abb-4d1a-8287-9041c336c678" containerID="cec077e080f0d09301e16fc0bedc02ab7146565f449d5c63057e60869a1d8412" exitCode=0 Mar 12 13:30:48 crc kubenswrapper[4921]: I0312 13:30:48.245857 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" event={"ID":"b8b648cc-0abb-4d1a-8287-9041c336c678","Type":"ContainerDied","Data":"cec077e080f0d09301e16fc0bedc02ab7146565f449d5c63057e60869a1d8412"} Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.608200 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.608640 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.733379 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.745288 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pm2mb" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791473 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g597t\" (UniqueName: \"kubernetes.io/projected/6f893a7c-5319-4a48-b19e-7405f7d64887-kube-api-access-g597t\") pod \"6f893a7c-5319-4a48-b19e-7405f7d64887\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791589 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-combined-ca-bundle\") pod \"b8b648cc-0abb-4d1a-8287-9041c336c678\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791633 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nk96\" (UniqueName: \"kubernetes.io/projected/b8b648cc-0abb-4d1a-8287-9041c336c678-kube-api-access-2nk96\") pod \"b8b648cc-0abb-4d1a-8287-9041c336c678\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791691 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-config-data\") pod \"6f893a7c-5319-4a48-b19e-7405f7d64887\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791712 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-scripts\") pod \"6f893a7c-5319-4a48-b19e-7405f7d64887\" (UID: 
\"6f893a7c-5319-4a48-b19e-7405f7d64887\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791836 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-scripts\") pod \"b8b648cc-0abb-4d1a-8287-9041c336c678\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791894 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-config-data\") pod \"b8b648cc-0abb-4d1a-8287-9041c336c678\" (UID: \"b8b648cc-0abb-4d1a-8287-9041c336c678\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.791956 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-combined-ca-bundle\") pod \"6f893a7c-5319-4a48-b19e-7405f7d64887\" (UID: \"6f893a7c-5319-4a48-b19e-7405f7d64887\") " Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.806760 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-scripts" (OuterVolumeSpecName: "scripts") pod "6f893a7c-5319-4a48-b19e-7405f7d64887" (UID: "6f893a7c-5319-4a48-b19e-7405f7d64887"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.806776 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f893a7c-5319-4a48-b19e-7405f7d64887-kube-api-access-g597t" (OuterVolumeSpecName: "kube-api-access-g597t") pod "6f893a7c-5319-4a48-b19e-7405f7d64887" (UID: "6f893a7c-5319-4a48-b19e-7405f7d64887"). InnerVolumeSpecName "kube-api-access-g597t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.808637 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-scripts" (OuterVolumeSpecName: "scripts") pod "b8b648cc-0abb-4d1a-8287-9041c336c678" (UID: "b8b648cc-0abb-4d1a-8287-9041c336c678"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.820070 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b648cc-0abb-4d1a-8287-9041c336c678-kube-api-access-2nk96" (OuterVolumeSpecName: "kube-api-access-2nk96") pod "b8b648cc-0abb-4d1a-8287-9041c336c678" (UID: "b8b648cc-0abb-4d1a-8287-9041c336c678"). InnerVolumeSpecName "kube-api-access-2nk96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.833890 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-config-data" (OuterVolumeSpecName: "config-data") pod "b8b648cc-0abb-4d1a-8287-9041c336c678" (UID: "b8b648cc-0abb-4d1a-8287-9041c336c678"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.836175 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8b648cc-0abb-4d1a-8287-9041c336c678" (UID: "b8b648cc-0abb-4d1a-8287-9041c336c678"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.838654 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-config-data" (OuterVolumeSpecName: "config-data") pod "6f893a7c-5319-4a48-b19e-7405f7d64887" (UID: "6f893a7c-5319-4a48-b19e-7405f7d64887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.840364 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.862469 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f893a7c-5319-4a48-b19e-7405f7d64887" (UID: "6f893a7c-5319-4a48-b19e-7405f7d64887"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.869543 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895673 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895711 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895724 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g597t\" (UniqueName: \"kubernetes.io/projected/6f893a7c-5319-4a48-b19e-7405f7d64887-kube-api-access-g597t\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895735 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895746 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nk96\" (UniqueName: \"kubernetes.io/projected/b8b648cc-0abb-4d1a-8287-9041c336c678-kube-api-access-2nk96\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895756 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895765 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/6f893a7c-5319-4a48-b19e-7405f7d64887-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.895778 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8b648cc-0abb-4d1a-8287-9041c336c678-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.949424 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-msxxx"] Mar 12 13:30:49 crc kubenswrapper[4921]: I0312 13:30:49.949750 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerName="dnsmasq-dns" containerID="cri-o://17d350bba769855dcb48166e7328c4f44fbc010288bd9d064cc318b9c562d02e" gracePeriod=10 Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.269555 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pm2mb" event={"ID":"6f893a7c-5319-4a48-b19e-7405f7d64887","Type":"ContainerDied","Data":"9570bc7e08ad97780d4a4dbb5da3ad9770558b76953eeb8e56846e971461f010"} Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.269625 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9570bc7e08ad97780d4a4dbb5da3ad9770558b76953eeb8e56846e971461f010" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.269727 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pm2mb" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.272342 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" event={"ID":"b8b648cc-0abb-4d1a-8287-9041c336c678","Type":"ContainerDied","Data":"bd3d7d136350a03df30eacd019dbe5533d17e58144a85d6f3dbb417070e6303f"} Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.272384 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-lqd7h" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.272384 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd3d7d136350a03df30eacd019dbe5533d17e58144a85d6f3dbb417070e6303f" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.276381 4921 generic.go:334] "Generic (PLEG): container finished" podID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerID="17d350bba769855dcb48166e7328c4f44fbc010288bd9d064cc318b9c562d02e" exitCode=0 Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.277260 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" event={"ID":"c073128e-fc26-48f6-98d1-cdbb747363c6","Type":"ContainerDied","Data":"17d350bba769855dcb48166e7328c4f44fbc010288bd9d064cc318b9c562d02e"} Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.357023 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.371782 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374133 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 13:30:50 crc kubenswrapper[4921]: E0312 13:30:50.374426 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8b648cc-0abb-4d1a-8287-9041c336c678" containerName="nova-cell1-conductor-db-sync" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374442 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b648cc-0abb-4d1a-8287-9041c336c678" containerName="nova-cell1-conductor-db-sync" Mar 12 13:30:50 crc kubenswrapper[4921]: E0312 13:30:50.374459 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerName="init" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374467 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerName="init" Mar 12 13:30:50 crc kubenswrapper[4921]: E0312 13:30:50.374487 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerName="dnsmasq-dns" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374493 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerName="dnsmasq-dns" Mar 12 13:30:50 crc kubenswrapper[4921]: E0312 13:30:50.374508 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f893a7c-5319-4a48-b19e-7405f7d64887" containerName="nova-manage" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374514 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f893a7c-5319-4a48-b19e-7405f7d64887" containerName="nova-manage" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374673 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8b648cc-0abb-4d1a-8287-9041c336c678" 
containerName="nova-cell1-conductor-db-sync" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374685 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f893a7c-5319-4a48-b19e-7405f7d64887" containerName="nova-manage" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.374693 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" containerName="dnsmasq-dns" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.375207 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.380392 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.388912 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.472966 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.473551 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-log" containerID="cri-o://074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6" gracePeriod=30 Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.473887 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-api" containerID="cri-o://8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f" gracePeriod=30 Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.482502 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" 
containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": EOF" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.482644 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.177:8774/\": EOF" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.507487 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-nb\") pod \"c073128e-fc26-48f6-98d1-cdbb747363c6\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.507539 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krrmn\" (UniqueName: \"kubernetes.io/projected/c073128e-fc26-48f6-98d1-cdbb747363c6-kube-api-access-krrmn\") pod \"c073128e-fc26-48f6-98d1-cdbb747363c6\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.507590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-config\") pod \"c073128e-fc26-48f6-98d1-cdbb747363c6\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.507668 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-dns-svc\") pod \"c073128e-fc26-48f6-98d1-cdbb747363c6\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.507686 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-sb\") pod \"c073128e-fc26-48f6-98d1-cdbb747363c6\" (UID: \"c073128e-fc26-48f6-98d1-cdbb747363c6\") " Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.507963 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz6hh\" (UniqueName: \"kubernetes.io/projected/a7798e1f-b22a-4ebd-a812-e8c17694cf60-kube-api-access-rz6hh\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.508028 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7798e1f-b22a-4ebd-a812-e8c17694cf60-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.508046 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7798e1f-b22a-4ebd-a812-e8c17694cf60-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.511489 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c073128e-fc26-48f6-98d1-cdbb747363c6-kube-api-access-krrmn" (OuterVolumeSpecName: "kube-api-access-krrmn") pod "c073128e-fc26-48f6-98d1-cdbb747363c6" (UID: "c073128e-fc26-48f6-98d1-cdbb747363c6"). InnerVolumeSpecName "kube-api-access-krrmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.559746 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-config" (OuterVolumeSpecName: "config") pod "c073128e-fc26-48f6-98d1-cdbb747363c6" (UID: "c073128e-fc26-48f6-98d1-cdbb747363c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.565259 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c073128e-fc26-48f6-98d1-cdbb747363c6" (UID: "c073128e-fc26-48f6-98d1-cdbb747363c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.568910 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c073128e-fc26-48f6-98d1-cdbb747363c6" (UID: "c073128e-fc26-48f6-98d1-cdbb747363c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.570239 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c073128e-fc26-48f6-98d1-cdbb747363c6" (UID: "c073128e-fc26-48f6-98d1-cdbb747363c6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609609 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7798e1f-b22a-4ebd-a812-e8c17694cf60-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609650 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7798e1f-b22a-4ebd-a812-e8c17694cf60-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609759 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz6hh\" (UniqueName: \"kubernetes.io/projected/a7798e1f-b22a-4ebd-a812-e8c17694cf60-kube-api-access-rz6hh\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609847 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609858 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krrmn\" (UniqueName: \"kubernetes.io/projected/c073128e-fc26-48f6-98d1-cdbb747363c6-kube-api-access-krrmn\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609870 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-config\") on node \"crc\" DevicePath \"\"" 
Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609879 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.609886 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c073128e-fc26-48f6-98d1-cdbb747363c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.627880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7798e1f-b22a-4ebd-a812-e8c17694cf60-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.634474 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz6hh\" (UniqueName: \"kubernetes.io/projected/a7798e1f-b22a-4ebd-a812-e8c17694cf60-kube-api-access-rz6hh\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.643310 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7798e1f-b22a-4ebd-a812-e8c17694cf60-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a7798e1f-b22a-4ebd-a812-e8c17694cf60\") " pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.695866 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:50 crc kubenswrapper[4921]: I0312 13:30:50.911602 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.157829 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.285010 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.285036 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-msxxx" event={"ID":"c073128e-fc26-48f6-98d1-cdbb747363c6","Type":"ContainerDied","Data":"ed7a6a0820ca72a0749c72a878e56251a1e44693b3cbfa5cdcb49feba2d62f8c"} Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.285521 4921 scope.go:117] "RemoveContainer" containerID="17d350bba769855dcb48166e7328c4f44fbc010288bd9d064cc318b9c562d02e" Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.286890 4921 generic.go:334] "Generic (PLEG): container finished" podID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerID="074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6" exitCode=143 Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.286947 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f783b72a-b838-4000-8d46-fa9fd1a7f8e1","Type":"ContainerDied","Data":"074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6"} Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.288737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a7798e1f-b22a-4ebd-a812-e8c17694cf60","Type":"ContainerStarted","Data":"a3153fda6e30d638e3b689adc11080d35e2c929826237ac7484f5969eebbcad8"} Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.312607 4921 scope.go:117] 
"RemoveContainer" containerID="c00a16430304386cbce881d70ab096702dcb2de626bce4ecbbbe89486aabafac" Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.325109 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-msxxx"] Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.333339 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-msxxx"] Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.668459 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 13:30:51 crc kubenswrapper[4921]: I0312 13:30:51.992841 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c073128e-fc26-48f6-98d1-cdbb747363c6" path="/var/lib/kubelet/pods/c073128e-fc26-48f6-98d1-cdbb747363c6/volumes" Mar 12 13:30:52 crc kubenswrapper[4921]: I0312 13:30:52.296610 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a7798e1f-b22a-4ebd-a812-e8c17694cf60","Type":"ContainerStarted","Data":"18775f664112dc1b6900d167caeeb9bd5675a22352b96a361cc397ea4f6626b7"} Mar 12 13:30:52 crc kubenswrapper[4921]: I0312 13:30:52.297007 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 12 13:30:52 crc kubenswrapper[4921]: I0312 13:30:52.299014 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" containerName="nova-scheduler-scheduler" containerID="cri-o://ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" gracePeriod=30 Mar 12 13:30:52 crc kubenswrapper[4921]: I0312 13:30:52.329520 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.329498748 podStartE2EDuration="2.329498748s" podCreationTimestamp="2026-03-12 13:30:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:52.319279689 +0000 UTC m=+1275.009351670" watchObservedRunningTime="2026-03-12 13:30:52.329498748 +0000 UTC m=+1275.019570719" Mar 12 13:30:54 crc kubenswrapper[4921]: I0312 13:30:54.553243 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:30:54 crc kubenswrapper[4921]: I0312 13:30:54.553772 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0f49cecf-a341-4a70-b7f7-e2f61c313f0a" containerName="kube-state-metrics" containerID="cri-o://eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4" gracePeriod=30 Mar 12 13:30:54 crc kubenswrapper[4921]: E0312 13:30:54.802241 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 13:30:54 crc kubenswrapper[4921]: E0312 13:30:54.805388 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 13:30:54 crc kubenswrapper[4921]: E0312 13:30:54.809145 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 13:30:54 crc kubenswrapper[4921]: 
E0312 13:30:54.809223 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" containerName="nova-scheduler-scheduler" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.115733 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.204830 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nkvb\" (UniqueName: \"kubernetes.io/projected/0f49cecf-a341-4a70-b7f7-e2f61c313f0a-kube-api-access-4nkvb\") pod \"0f49cecf-a341-4a70-b7f7-e2f61c313f0a\" (UID: \"0f49cecf-a341-4a70-b7f7-e2f61c313f0a\") " Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.212617 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f49cecf-a341-4a70-b7f7-e2f61c313f0a-kube-api-access-4nkvb" (OuterVolumeSpecName: "kube-api-access-4nkvb") pod "0f49cecf-a341-4a70-b7f7-e2f61c313f0a" (UID: "0f49cecf-a341-4a70-b7f7-e2f61c313f0a"). InnerVolumeSpecName "kube-api-access-4nkvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.306173 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nkvb\" (UniqueName: \"kubernetes.io/projected/0f49cecf-a341-4a70-b7f7-e2f61c313f0a-kube-api-access-4nkvb\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.307096 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.345297 4921 generic.go:334] "Generic (PLEG): container finished" podID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" exitCode=0 Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.345366 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.345403 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd9bc3b0-f90b-44d7-97af-c960fac4d831","Type":"ContainerDied","Data":"ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee"} Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.345466 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bd9bc3b0-f90b-44d7-97af-c960fac4d831","Type":"ContainerDied","Data":"d6ff25260633fe282db74e0404e8c3ddbc9ebc223d782f92a7c811f349d59e57"} Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.345488 4921 scope.go:117] "RemoveContainer" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.357217 4921 generic.go:334] "Generic (PLEG): container finished" podID="0f49cecf-a341-4a70-b7f7-e2f61c313f0a" containerID="eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4" exitCode=2 Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.357288 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f49cecf-a341-4a70-b7f7-e2f61c313f0a","Type":"ContainerDied","Data":"eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4"} Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.357348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"0f49cecf-a341-4a70-b7f7-e2f61c313f0a","Type":"ContainerDied","Data":"21e3ee46f781bab5c6a324ec045f8a8d497763cde1001138980f05c6078e90f6"} Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.357431 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.388272 4921 scope.go:117] "RemoveContainer" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" Mar 12 13:30:55 crc kubenswrapper[4921]: E0312 13:30:55.388769 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee\": container with ID starting with ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee not found: ID does not exist" containerID="ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.388799 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee"} err="failed to get container status \"ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee\": rpc error: code = NotFound desc = could not find container \"ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee\": container with ID starting with ccb76cc46533833d6a5937198e64ea818c5fc3e3b11e2bd31a7cc7d500a9d9ee not found: ID does not exist" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.388835 4921 scope.go:117] "RemoveContainer" containerID="eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.406031 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.411154 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-config-data\") pod \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.411211 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzvfg\" (UniqueName: \"kubernetes.io/projected/bd9bc3b0-f90b-44d7-97af-c960fac4d831-kube-api-access-qzvfg\") pod \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.411284 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-combined-ca-bundle\") pod \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\" (UID: \"bd9bc3b0-f90b-44d7-97af-c960fac4d831\") " Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.417982 4921 scope.go:117] "RemoveContainer" containerID="eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.418005 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: E0312 13:30:55.418921 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4\": container with ID starting with eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4 not found: ID does not exist" containerID="eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.418962 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4"} err="failed to get container status \"eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4\": rpc error: code = NotFound desc = could not find container \"eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4\": container with ID starting with eee5f029f99aabaec6892b8ba75bc02e91da92cdf3d217f8b389f467fbd7c8e4 not found: ID does not exist" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.425106 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9bc3b0-f90b-44d7-97af-c960fac4d831-kube-api-access-qzvfg" (OuterVolumeSpecName: "kube-api-access-qzvfg") pod "bd9bc3b0-f90b-44d7-97af-c960fac4d831" (UID: "bd9bc3b0-f90b-44d7-97af-c960fac4d831"). InnerVolumeSpecName "kube-api-access-qzvfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.425280 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: E0312 13:30:55.425946 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f49cecf-a341-4a70-b7f7-e2f61c313f0a" containerName="kube-state-metrics" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.425974 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f49cecf-a341-4a70-b7f7-e2f61c313f0a" containerName="kube-state-metrics" Mar 12 13:30:55 crc kubenswrapper[4921]: E0312 13:30:55.425999 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" containerName="nova-scheduler-scheduler" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.426013 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" containerName="nova-scheduler-scheduler" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.426435 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0f49cecf-a341-4a70-b7f7-e2f61c313f0a" containerName="kube-state-metrics" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.426528 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" containerName="nova-scheduler-scheduler" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.427805 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.431267 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.431485 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.433048 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.451455 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-config-data" (OuterVolumeSpecName: "config-data") pod "bd9bc3b0-f90b-44d7-97af-c960fac4d831" (UID: "bd9bc3b0-f90b-44d7-97af-c960fac4d831"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.457447 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd9bc3b0-f90b-44d7-97af-c960fac4d831" (UID: "bd9bc3b0-f90b-44d7-97af-c960fac4d831"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.512716 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.512776 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.512845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.512937 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmzt\" (UniqueName: \"kubernetes.io/projected/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-api-access-qbmzt\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.513022 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 
13:30:55.513033 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzvfg\" (UniqueName: \"kubernetes.io/projected/bd9bc3b0-f90b-44d7-97af-c960fac4d831-kube-api-access-qzvfg\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.513044 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd9bc3b0-f90b-44d7-97af-c960fac4d831-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.614537 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmzt\" (UniqueName: \"kubernetes.io/projected/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-api-access-qbmzt\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.614643 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.614692 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.614763 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.618440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.618576 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.618649 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.636281 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmzt\" (UniqueName: \"kubernetes.io/projected/01d94a77-b0dc-48b9-863b-71dbccd74bfb-kube-api-access-qbmzt\") pod \"kube-state-metrics-0\" (UID: \"01d94a77-b0dc-48b9-863b-71dbccd74bfb\") " pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.666420 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.666684 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" 
containerName="ceilometer-central-agent" containerID="cri-o://b33ba9207b0baccb57a49d2265b852fe1696b657bc0dda39bc9d05884905bb61" gracePeriod=30 Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.666851 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="proxy-httpd" containerID="cri-o://60ad4625461cb042e7fca2f54c152e8678c5cf207605d18a176c244f6f5ec03a" gracePeriod=30 Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.666912 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="sg-core" containerID="cri-o://c93d7f4e6563a5458f4745865be6dc06aceca7828b20f9e5878ac34b437a485e" gracePeriod=30 Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.666968 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-notification-agent" containerID="cri-o://5067117721fc092451c4f9b01c750a89c89680c8845f066ffd39b954a70bff68" gracePeriod=30 Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.690911 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.703663 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.714794 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.715948 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.717921 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.728290 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.749166 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.817358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89gpn\" (UniqueName: \"kubernetes.io/projected/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-kube-api-access-89gpn\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.817492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-config-data\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.817546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.919756 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89gpn\" (UniqueName: \"kubernetes.io/projected/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-kube-api-access-89gpn\") pod 
\"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.920090 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-config-data\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.920145 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.924831 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.937858 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-config-data\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.942204 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89gpn\" (UniqueName: \"kubernetes.io/projected/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-kube-api-access-89gpn\") pod \"nova-scheduler-0\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " pod="openstack/nova-scheduler-0" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.997979 4921 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f49cecf-a341-4a70-b7f7-e2f61c313f0a" path="/var/lib/kubelet/pods/0f49cecf-a341-4a70-b7f7-e2f61c313f0a/volumes" Mar 12 13:30:55 crc kubenswrapper[4921]: I0312 13:30:55.999063 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd9bc3b0-f90b-44d7-97af-c960fac4d831" path="/var/lib/kubelet/pods/bd9bc3b0-f90b-44d7-97af-c960fac4d831/volumes" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.033552 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.209753 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.248191 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.329577 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-logs\") pod \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.329737 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-config-data\") pod \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.329830 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97h5l\" (UniqueName: \"kubernetes.io/projected/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-kube-api-access-97h5l\") pod \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " Mar 12 13:30:56 crc 
kubenswrapper[4921]: I0312 13:30:56.329884 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-combined-ca-bundle\") pod \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\" (UID: \"f783b72a-b838-4000-8d46-fa9fd1a7f8e1\") " Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.331262 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-logs" (OuterVolumeSpecName: "logs") pod "f783b72a-b838-4000-8d46-fa9fd1a7f8e1" (UID: "f783b72a-b838-4000-8d46-fa9fd1a7f8e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.333940 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-kube-api-access-97h5l" (OuterVolumeSpecName: "kube-api-access-97h5l") pod "f783b72a-b838-4000-8d46-fa9fd1a7f8e1" (UID: "f783b72a-b838-4000-8d46-fa9fd1a7f8e1"). InnerVolumeSpecName "kube-api-access-97h5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.353460 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f783b72a-b838-4000-8d46-fa9fd1a7f8e1" (UID: "f783b72a-b838-4000-8d46-fa9fd1a7f8e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.354195 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-config-data" (OuterVolumeSpecName: "config-data") pod "f783b72a-b838-4000-8d46-fa9fd1a7f8e1" (UID: "f783b72a-b838-4000-8d46-fa9fd1a7f8e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.372158 4921 generic.go:334] "Generic (PLEG): container finished" podID="24ae94ce-75ed-4179-b457-22252ef9664b" containerID="60ad4625461cb042e7fca2f54c152e8678c5cf207605d18a176c244f6f5ec03a" exitCode=0 Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.372183 4921 generic.go:334] "Generic (PLEG): container finished" podID="24ae94ce-75ed-4179-b457-22252ef9664b" containerID="c93d7f4e6563a5458f4745865be6dc06aceca7828b20f9e5878ac34b437a485e" exitCode=2 Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.372191 4921 generic.go:334] "Generic (PLEG): container finished" podID="24ae94ce-75ed-4179-b457-22252ef9664b" containerID="b33ba9207b0baccb57a49d2265b852fe1696b657bc0dda39bc9d05884905bb61" exitCode=0 Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.372221 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerDied","Data":"60ad4625461cb042e7fca2f54c152e8678c5cf207605d18a176c244f6f5ec03a"} Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.372241 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerDied","Data":"c93d7f4e6563a5458f4745865be6dc06aceca7828b20f9e5878ac34b437a485e"} Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.372252 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerDied","Data":"b33ba9207b0baccb57a49d2265b852fe1696b657bc0dda39bc9d05884905bb61"} Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.373385 4921 generic.go:334] "Generic (PLEG): container finished" podID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerID="8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f" exitCode=0 Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.373414 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f783b72a-b838-4000-8d46-fa9fd1a7f8e1","Type":"ContainerDied","Data":"8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f"} Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.373426 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f783b72a-b838-4000-8d46-fa9fd1a7f8e1","Type":"ContainerDied","Data":"95d48a7052b9d6ba2e28ccc01d1b70df9ce813708b65da33fb5f55a42d026d8e"} Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.373441 4921 scope.go:117] "RemoveContainer" containerID="8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.373518 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.377950 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"01d94a77-b0dc-48b9-863b-71dbccd74bfb","Type":"ContainerStarted","Data":"99804c01ca57c49ce6cb766756c77faab2ad2d44248973c99cd4889d523acf86"} Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.407645 4921 scope.go:117] "RemoveContainer" containerID="074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.415945 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.427164 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.442330 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.442678 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.442696 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97h5l\" (UniqueName: \"kubernetes.io/projected/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-kube-api-access-97h5l\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.442711 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f783b72a-b838-4000-8d46-fa9fd1a7f8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.448838 4921 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:56 crc kubenswrapper[4921]: E0312 13:30:56.449630 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-api" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.449658 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-api" Mar 12 13:30:56 crc kubenswrapper[4921]: E0312 13:30:56.449724 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-log" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.449735 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-log" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.450336 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-api" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.450420 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" containerName="nova-api-log" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.452403 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.457756 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.459962 4921 scope.go:117] "RemoveContainer" containerID="8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f" Mar 12 13:30:56 crc kubenswrapper[4921]: E0312 13:30:56.463028 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f\": container with ID starting with 8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f not found: ID does not exist" containerID="8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.463081 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f"} err="failed to get container status \"8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f\": rpc error: code = NotFound desc = could not find container \"8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f\": container with ID starting with 8c12873786ff586b88facbebd212b1bcc8dc6c06e68643b6e0c522f61b30a87f not found: ID does not exist" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.463107 4921 scope.go:117] "RemoveContainer" containerID="074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6" Mar 12 13:30:56 crc kubenswrapper[4921]: E0312 13:30:56.463516 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6\": container with ID starting with 074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6 not found: 
ID does not exist" containerID="074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.463551 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6"} err="failed to get container status \"074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6\": rpc error: code = NotFound desc = could not find container \"074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6\": container with ID starting with 074ed423de1b8cff82d8a0d4d19347b873547722dedc884f8452fca4999101b6 not found: ID does not exist" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.471701 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.507947 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.543828 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.543917 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f786c\" (UniqueName: \"kubernetes.io/projected/3302ef33-557d-4934-9dab-57dcbc94d090-kube-api-access-f786c\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.543948 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-config-data\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.543999 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3302ef33-557d-4934-9dab-57dcbc94d090-logs\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.645487 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f786c\" (UniqueName: \"kubernetes.io/projected/3302ef33-557d-4934-9dab-57dcbc94d090-kube-api-access-f786c\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.646683 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-config-data\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.646886 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3302ef33-557d-4934-9dab-57dcbc94d090-logs\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.647068 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.647346 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3302ef33-557d-4934-9dab-57dcbc94d090-logs\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.651776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.652004 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-config-data\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.661421 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f786c\" (UniqueName: \"kubernetes.io/projected/3302ef33-557d-4934-9dab-57dcbc94d090-kube-api-access-f786c\") pod \"nova-api-0\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " pod="openstack/nova-api-0" Mar 12 13:30:56 crc kubenswrapper[4921]: I0312 13:30:56.779036 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:30:57 crc kubenswrapper[4921]: I0312 13:30:57.021384 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:30:57 crc kubenswrapper[4921]: W0312 13:30:57.030716 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3302ef33_557d_4934_9dab_57dcbc94d090.slice/crio-279aa8e19d2a0647b8e6e6835620fd749fb44bd47ff9d8d04661c17e8704f715 WatchSource:0}: Error finding container 279aa8e19d2a0647b8e6e6835620fd749fb44bd47ff9d8d04661c17e8704f715: Status 404 returned error can't find the container with id 279aa8e19d2a0647b8e6e6835620fd749fb44bd47ff9d8d04661c17e8704f715 Mar 12 13:30:57 crc kubenswrapper[4921]: I0312 13:30:57.401952 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3302ef33-557d-4934-9dab-57dcbc94d090","Type":"ContainerStarted","Data":"f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb"} Mar 12 13:30:57 crc kubenswrapper[4921]: I0312 13:30:57.402332 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3302ef33-557d-4934-9dab-57dcbc94d090","Type":"ContainerStarted","Data":"279aa8e19d2a0647b8e6e6835620fd749fb44bd47ff9d8d04661c17e8704f715"} Mar 12 13:30:57 crc kubenswrapper[4921]: I0312 13:30:57.403965 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9","Type":"ContainerStarted","Data":"38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5"} Mar 12 13:30:57 crc kubenswrapper[4921]: I0312 13:30:57.404004 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9","Type":"ContainerStarted","Data":"9d901e7663c04a70aff38e920862937467e2f11e17b0debdcb5e219da6d3d37f"} Mar 12 13:30:57 crc kubenswrapper[4921]: I0312 
13:30:57.423586 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.423567097 podStartE2EDuration="2.423567097s" podCreationTimestamp="2026-03-12 13:30:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:57.419781702 +0000 UTC m=+1280.109853673" watchObservedRunningTime="2026-03-12 13:30:57.423567097 +0000 UTC m=+1280.113639088" Mar 12 13:30:58 crc kubenswrapper[4921]: I0312 13:30:58.000198 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f783b72a-b838-4000-8d46-fa9fd1a7f8e1" path="/var/lib/kubelet/pods/f783b72a-b838-4000-8d46-fa9fd1a7f8e1/volumes" Mar 12 13:30:58 crc kubenswrapper[4921]: I0312 13:30:58.412445 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3302ef33-557d-4934-9dab-57dcbc94d090","Type":"ContainerStarted","Data":"1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41"} Mar 12 13:30:58 crc kubenswrapper[4921]: I0312 13:30:58.437467 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.437443563 podStartE2EDuration="2.437443563s" podCreationTimestamp="2026-03-12 13:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:30:58.435463003 +0000 UTC m=+1281.125534984" watchObservedRunningTime="2026-03-12 13:30:58.437443563 +0000 UTC m=+1281.127515564" Mar 12 13:30:58 crc kubenswrapper[4921]: I0312 13:30:58.537193 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 13:30:58 crc kubenswrapper[4921]: I0312 13:30:58.537239 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 
13:30:59.431466 4921 generic.go:334] "Generic (PLEG): container finished" podID="24ae94ce-75ed-4179-b457-22252ef9664b" containerID="5067117721fc092451c4f9b01c750a89c89680c8845f066ffd39b954a70bff68" exitCode=0
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.431538 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerDied","Data":"5067117721fc092451c4f9b01c750a89c89680c8845f066ffd39b954a70bff68"}
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.602548 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.702614 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-scripts\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.702701 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxqqf\" (UniqueName: \"kubernetes.io/projected/24ae94ce-75ed-4179-b457-22252ef9664b-kube-api-access-kxqqf\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.702738 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-run-httpd\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.702782 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-combined-ca-bundle\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.702837 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-config-data\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.702898 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-log-httpd\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.703016 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-sg-core-conf-yaml\") pod \"24ae94ce-75ed-4179-b457-22252ef9664b\" (UID: \"24ae94ce-75ed-4179-b457-22252ef9664b\") "
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.708503 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.708981 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.711485 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24ae94ce-75ed-4179-b457-22252ef9664b-kube-api-access-kxqqf" (OuterVolumeSpecName: "kube-api-access-kxqqf") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "kube-api-access-kxqqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.712710 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-scripts" (OuterVolumeSpecName: "scripts") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.730671 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.775629 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.798245 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-config-data" (OuterVolumeSpecName: "config-data") pod "24ae94ce-75ed-4179-b457-22252ef9664b" (UID: "24ae94ce-75ed-4179-b457-22252ef9664b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804699 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-scripts\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804729 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxqqf\" (UniqueName: \"kubernetes.io/projected/24ae94ce-75ed-4179-b457-22252ef9664b-kube-api-access-kxqqf\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804740 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804749 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804758 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804767 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24ae94ce-75ed-4179-b457-22252ef9664b-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 12 13:30:59 crc kubenswrapper[4921]: I0312 13:30:59.804774 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24ae94ce-75ed-4179-b457-22252ef9664b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.443744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"01d94a77-b0dc-48b9-863b-71dbccd74bfb","Type":"ContainerStarted","Data":"cd4ff41cc5352817c7ac3f525a8514b49c37550852873618974a9788199d8b93"}
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.443856 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.447135 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24ae94ce-75ed-4179-b457-22252ef9664b","Type":"ContainerDied","Data":"79cfdbd8708d65f421b12fff7e08b877335552b69559f630ddb2acb0130ad268"}
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.447225 4921 scope.go:117] "RemoveContainer" containerID="60ad4625461cb042e7fca2f54c152e8678c5cf207605d18a176c244f6f5ec03a"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.447459 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.472772 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.703941505 podStartE2EDuration="5.472742444s" podCreationTimestamp="2026-03-12 13:30:55 +0000 UTC" firstStartedPulling="2026-03-12 13:30:56.212636664 +0000 UTC m=+1278.902708635" lastFinishedPulling="2026-03-12 13:30:59.981437603 +0000 UTC m=+1282.671509574" observedRunningTime="2026-03-12 13:31:00.466543717 +0000 UTC m=+1283.156615698" watchObservedRunningTime="2026-03-12 13:31:00.472742444 +0000 UTC m=+1283.162814415"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.478910 4921 scope.go:117] "RemoveContainer" containerID="c93d7f4e6563a5458f4745865be6dc06aceca7828b20f9e5878ac34b437a485e"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.509985 4921 scope.go:117] "RemoveContainer" containerID="5067117721fc092451c4f9b01c750a89c89680c8845f066ffd39b954a70bff68"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.510847 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.528152 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.537338 4921 scope.go:117] "RemoveContainer" containerID="b33ba9207b0baccb57a49d2265b852fe1696b657bc0dda39bc9d05884905bb61"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.542423 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:31:00 crc kubenswrapper[4921]: E0312 13:31:00.542857 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="proxy-httpd"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.542878 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="proxy-httpd"
Mar 12 13:31:00 crc kubenswrapper[4921]: E0312 13:31:00.542914 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-notification-agent"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.542922 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-notification-agent"
Mar 12 13:31:00 crc kubenswrapper[4921]: E0312 13:31:00.542943 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-central-agent"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.542951 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-central-agent"
Mar 12 13:31:00 crc kubenswrapper[4921]: E0312 13:31:00.542965 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="sg-core"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.542974 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="sg-core"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.543155 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="proxy-httpd"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.543177 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="sg-core"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.543186 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-notification-agent"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.543202 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" containerName="ceilometer-central-agent"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.544796 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.547256 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.547434 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.548689 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.550240 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.619951 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.620074 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.620323 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-scripts\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.620502 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5hr4\" (UniqueName: \"kubernetes.io/projected/44836ae0-9135-463a-8694-19de955d2e66-kube-api-access-q5hr4\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.621109 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-run-httpd\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.621185 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.621258 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-log-httpd\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.621370 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-config-data\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.723709 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.724349 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-scripts\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.724473 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5hr4\" (UniqueName: \"kubernetes.io/projected/44836ae0-9135-463a-8694-19de955d2e66-kube-api-access-q5hr4\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.724566 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-run-httpd\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.724643 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.724737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-log-httpd\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.724843 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-config-data\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.725015 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.726272 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.727756 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-log-httpd\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.726675 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-run-httpd\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.732303 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.738750 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.739956 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-config-data\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.743363 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.743503 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-scripts\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.745450 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5hr4\" (UniqueName: \"kubernetes.io/projected/44836ae0-9135-463a-8694-19de955d2e66-kube-api-access-q5hr4\") pod \"ceilometer-0\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " pod="openstack/ceilometer-0"
Mar 12 13:31:00 crc kubenswrapper[4921]: I0312 13:31:00.886797 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 12 13:31:01 crc kubenswrapper[4921]: I0312 13:31:01.034245 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 12 13:31:01 crc kubenswrapper[4921]: I0312 13:31:01.329097 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 12 13:31:01 crc kubenswrapper[4921]: I0312 13:31:01.458200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerStarted","Data":"a369615aa55d288ae1562f04a68dc61bc0b939f54706c574804abcd64da5564a"}
Mar 12 13:31:01 crc kubenswrapper[4921]: I0312 13:31:01.993406 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24ae94ce-75ed-4179-b457-22252ef9664b" path="/var/lib/kubelet/pods/24ae94ce-75ed-4179-b457-22252ef9664b/volumes"
Mar 12 13:31:03 crc kubenswrapper[4921]: I0312 13:31:03.475639 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerStarted","Data":"b308f05090e56d2ea4ecebd771f4aa433a5f7d2df8abcf517f23d3ab62112939"}
Mar 12 13:31:04 crc kubenswrapper[4921]: I0312 13:31:04.487299 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerStarted","Data":"0908c15a770eac4ced285413c97d4bb1e1080776174bd817119a096fe318eb9e"}
Mar 12 13:31:05 crc kubenswrapper[4921]: I0312 13:31:05.505764 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerStarted","Data":"c5e51c6ee2388bf0d86fd13c76221a87f2e49b96e0cf1f4685eaf2a7f4f24b91"}
Mar 12 13:31:05 crc kubenswrapper[4921]: I0312 13:31:05.766427 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 12 13:31:06 crc kubenswrapper[4921]: I0312 13:31:06.033805 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 12 13:31:06 crc kubenswrapper[4921]: I0312 13:31:06.081052 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 12 13:31:06 crc kubenswrapper[4921]: I0312 13:31:06.558723 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 12 13:31:06 crc kubenswrapper[4921]: I0312 13:31:06.780986 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 13:31:06 crc kubenswrapper[4921]: I0312 13:31:06.781631 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 13:31:07 crc kubenswrapper[4921]: I0312 13:31:07.525727 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerStarted","Data":"4d3e4b092ff0ead2067d6da89966b4b2429ee0e77707207913218fa5fe7429df"}
Mar 12 13:31:07 crc kubenswrapper[4921]: I0312 13:31:07.526739 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 12 13:31:07 crc kubenswrapper[4921]: I0312 13:31:07.556900 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.263359399 podStartE2EDuration="7.556877626s" podCreationTimestamp="2026-03-12 13:31:00 +0000 UTC" firstStartedPulling="2026-03-12 13:31:01.333873273 +0000 UTC m=+1284.023945244" lastFinishedPulling="2026-03-12 13:31:06.6273915 +0000 UTC m=+1289.317463471" observedRunningTime="2026-03-12 13:31:07.547297357 +0000 UTC m=+1290.237369338" watchObservedRunningTime="2026-03-12 13:31:07.556877626 +0000 UTC m=+1290.246949597"
Mar 12 13:31:07 crc kubenswrapper[4921]: I0312 13:31:07.862962 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 13:31:07 crc kubenswrapper[4921]: I0312 13:31:07.862982 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.186:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.612626 4921 generic.go:334] "Generic (PLEG): container finished" podID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerID="8a9ab96a81d8a70955c31c19fd61b688c079bef8c3e44a71f91176aa5f39f2e3" exitCode=137
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.612719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"217d08e5-4eb0-4ba3-8c09-2249efa14d22","Type":"ContainerDied","Data":"8a9ab96a81d8a70955c31c19fd61b688c079bef8c3e44a71f91176aa5f39f2e3"}
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.615516 4921 generic.go:334] "Generic (PLEG): container finished" podID="7b0fe145-4f1b-4673-8876-6fa26c82d046" containerID="0f56e514eb4f3b0c3c628b6e958dd0dec3f9253bed6d2fd6846ea94f9a6d4894" exitCode=137
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.615545 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7b0fe145-4f1b-4673-8876-6fa26c82d046","Type":"ContainerDied","Data":"0f56e514eb4f3b0c3c628b6e958dd0dec3f9253bed6d2fd6846ea94f9a6d4894"}
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.730897 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.737142 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.779696 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.779772 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822252 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-combined-ca-bundle\") pod \"7b0fe145-4f1b-4673-8876-6fa26c82d046\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822362 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvt6n\" (UniqueName: \"kubernetes.io/projected/7b0fe145-4f1b-4673-8876-6fa26c82d046-kube-api-access-bvt6n\") pod \"7b0fe145-4f1b-4673-8876-6fa26c82d046\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-config-data\") pod \"7b0fe145-4f1b-4673-8876-6fa26c82d046\" (UID: \"7b0fe145-4f1b-4673-8876-6fa26c82d046\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822485 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-combined-ca-bundle\") pod \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822528 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwg6l\" (UniqueName: \"kubernetes.io/projected/217d08e5-4eb0-4ba3-8c09-2249efa14d22-kube-api-access-xwg6l\") pod \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822660 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-config-data\") pod \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.822713 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/217d08e5-4eb0-4ba3-8c09-2249efa14d22-logs\") pod \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\" (UID: \"217d08e5-4eb0-4ba3-8c09-2249efa14d22\") "
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.823531 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217d08e5-4eb0-4ba3-8c09-2249efa14d22-logs" (OuterVolumeSpecName: "logs") pod "217d08e5-4eb0-4ba3-8c09-2249efa14d22" (UID: "217d08e5-4eb0-4ba3-8c09-2249efa14d22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.827707 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0fe145-4f1b-4673-8876-6fa26c82d046-kube-api-access-bvt6n" (OuterVolumeSpecName: "kube-api-access-bvt6n") pod "7b0fe145-4f1b-4673-8876-6fa26c82d046" (UID: "7b0fe145-4f1b-4673-8876-6fa26c82d046"). InnerVolumeSpecName "kube-api-access-bvt6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.827755 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217d08e5-4eb0-4ba3-8c09-2249efa14d22-kube-api-access-xwg6l" (OuterVolumeSpecName: "kube-api-access-xwg6l") pod "217d08e5-4eb0-4ba3-8c09-2249efa14d22" (UID: "217d08e5-4eb0-4ba3-8c09-2249efa14d22"). InnerVolumeSpecName "kube-api-access-xwg6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.847537 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-config-data" (OuterVolumeSpecName: "config-data") pod "7b0fe145-4f1b-4673-8876-6fa26c82d046" (UID: "7b0fe145-4f1b-4673-8876-6fa26c82d046"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.848028 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "217d08e5-4eb0-4ba3-8c09-2249efa14d22" (UID: "217d08e5-4eb0-4ba3-8c09-2249efa14d22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.852921 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-config-data" (OuterVolumeSpecName: "config-data") pod "217d08e5-4eb0-4ba3-8c09-2249efa14d22" (UID: "217d08e5-4eb0-4ba3-8c09-2249efa14d22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.864538 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b0fe145-4f1b-4673-8876-6fa26c82d046" (UID: "7b0fe145-4f1b-4673-8876-6fa26c82d046"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925426 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925770 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvt6n\" (UniqueName: \"kubernetes.io/projected/7b0fe145-4f1b-4673-8876-6fa26c82d046-kube-api-access-bvt6n\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925798 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b0fe145-4f1b-4673-8876-6fa26c82d046-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925836 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925855 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwg6l\" (UniqueName: \"kubernetes.io/projected/217d08e5-4eb0-4ba3-8c09-2249efa14d22-kube-api-access-xwg6l\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925873 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/217d08e5-4eb0-4ba3-8c09-2249efa14d22-config-data\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:14 crc kubenswrapper[4921]: I0312 13:31:14.925888 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/217d08e5-4eb0-4ba3-8c09-2249efa14d22-logs\") on node \"crc\" DevicePath \"\""
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.625942 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7b0fe145-4f1b-4673-8876-6fa26c82d046","Type":"ContainerDied","Data":"283d05d61aa2293a6acb2efdc540cb35c152b432dbb0745ad144e3b6bf2783f5"}
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.625995 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.626030 4921 scope.go:117] "RemoveContainer" containerID="0f56e514eb4f3b0c3c628b6e958dd0dec3f9253bed6d2fd6846ea94f9a6d4894"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.630763 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"217d08e5-4eb0-4ba3-8c09-2249efa14d22","Type":"ContainerDied","Data":"8bf322cf22cc787a35393302923184ec61b51eb83047e31ff13c9cdb5637d346"}
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.630806 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.672510 4921 scope.go:117] "RemoveContainer" containerID="8a9ab96a81d8a70955c31c19fd61b688c079bef8c3e44a71f91176aa5f39f2e3"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.680036 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.693185 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.705978 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.716762 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.723996 4921 scope.go:117] "RemoveContainer" containerID="042a7c6eb55877775c0c17124d7a30aea19906eca4cdd64ed922b72bf671d89a"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731077 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 13:31:15 crc kubenswrapper[4921]: E0312 13:31:15.731466 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-log"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731487 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-log"
Mar 12 13:31:15 crc kubenswrapper[4921]: E0312 13:31:15.731550 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0fe145-4f1b-4673-8876-6fa26c82d046" containerName="nova-cell1-novncproxy-novncproxy"
Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731559 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0fe145-4f1b-4673-8876-6fa26c82d046"
containerName="nova-cell1-novncproxy-novncproxy" Mar 12 13:31:15 crc kubenswrapper[4921]: E0312 13:31:15.731602 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-metadata" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731612 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-metadata" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731838 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-log" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731871 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" containerName="nova-metadata-metadata" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.731903 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0fe145-4f1b-4673-8876-6fa26c82d046" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.733073 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.736454 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.739697 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.741129 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.748667 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.748710 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.749025 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.749332 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.752896 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.764780 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmp52\" (UniqueName: \"kubernetes.io/projected/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-kube-api-access-xmp52\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850393 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850447 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850491 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850510 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-logs\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850569 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-config-data\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850792 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850873 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wkmh\" (UniqueName: \"kubernetes.io/projected/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-kube-api-access-4wkmh\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.850956 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.952569 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.952668 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.952715 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-logs\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.952794 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.953758 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-config-data\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.953953 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.954044 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wkmh\" (UniqueName: \"kubernetes.io/projected/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-kube-api-access-4wkmh\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.954321 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.954411 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-logs\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.954472 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmp52\" (UniqueName: \"kubernetes.io/projected/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-kube-api-access-xmp52\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.954532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.957690 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.959567 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 
13:31:15.959862 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.960241 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.961068 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.962032 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-config-data\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.964169 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.976675 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wkmh\" (UniqueName: 
\"kubernetes.io/projected/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-kube-api-access-4wkmh\") pod \"nova-metadata-0\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " pod="openstack/nova-metadata-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.981017 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmp52\" (UniqueName: \"kubernetes.io/projected/6f997ce1-fc3d-4a1c-b9a8-d357e879f70d-kube-api-access-xmp52\") pod \"nova-cell1-novncproxy-0\" (UID: \"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.998002 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217d08e5-4eb0-4ba3-8c09-2249efa14d22" path="/var/lib/kubelet/pods/217d08e5-4eb0-4ba3-8c09-2249efa14d22/volumes" Mar 12 13:31:15 crc kubenswrapper[4921]: I0312 13:31:15.999171 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0fe145-4f1b-4673-8876-6fa26c82d046" path="/var/lib/kubelet/pods/7b0fe145-4f1b-4673-8876-6fa26c82d046/volumes" Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.068498 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.088211 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.559421 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.613781 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.641246 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615","Type":"ContainerStarted","Data":"ad3d763924641820cb03fe000299e90fc0d521a94f1c1db1103a7d32b6f72ccb"} Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.643468 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d","Type":"ContainerStarted","Data":"0d48f11419b5e6d0a9cef829e5b84779d58c6908d44603e66b78eb739438bdc3"} Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.783702 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.784345 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 13:31:16 crc kubenswrapper[4921]: I0312 13:31:16.787765 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.659931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615","Type":"ContainerStarted","Data":"7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded"} Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.660747 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615","Type":"ContainerStarted","Data":"3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e"} Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.662858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6f997ce1-fc3d-4a1c-b9a8-d357e879f70d","Type":"ContainerStarted","Data":"9fc809e82252a31fb3accfba3cf70281c3fb8f7a64a6b6a70f9a59423145ba06"} Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.672094 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.690988 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.690962603 podStartE2EDuration="2.690962603s" podCreationTimestamp="2026-03-12 13:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:17.678787866 +0000 UTC m=+1300.368859837" watchObservedRunningTime="2026-03-12 13:31:17.690962603 +0000 UTC m=+1300.381034614" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.716244 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.716227967 podStartE2EDuration="2.716227967s" podCreationTimestamp="2026-03-12 13:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:17.713424912 +0000 UTC m=+1300.403496893" watchObservedRunningTime="2026-03-12 13:31:17.716227967 +0000 UTC m=+1300.406299948" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.886546 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-z2mnm"] Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.888205 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.901954 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-z2mnm"] Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.988515 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-dns-svc\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.988581 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6spx\" (UniqueName: \"kubernetes.io/projected/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-kube-api-access-g6spx\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.988618 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-config\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.988684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:17 crc kubenswrapper[4921]: I0312 13:31:17.988715 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.092440 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-dns-svc\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.092512 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6spx\" (UniqueName: \"kubernetes.io/projected/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-kube-api-access-g6spx\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.092549 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-config\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.092631 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.092661 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.093354 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-dns-svc\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.093452 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.093800 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.093800 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-config\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.111923 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6spx\" (UniqueName: 
\"kubernetes.io/projected/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-kube-api-access-g6spx\") pod \"dnsmasq-dns-5b856c5697-z2mnm\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.205523 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:18 crc kubenswrapper[4921]: I0312 13:31:18.680461 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-z2mnm"] Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.687451 4921 generic.go:334] "Generic (PLEG): container finished" podID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerID="ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424" exitCode=0 Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.687555 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" event={"ID":"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c","Type":"ContainerDied","Data":"ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424"} Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.687821 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" event={"ID":"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c","Type":"ContainerStarted","Data":"61ab0e1891c999e8d566948a5972283a1334cedaf9b3aa9ed59be386d068f859"} Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.885327 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.885940 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-central-agent" containerID="cri-o://b308f05090e56d2ea4ecebd771f4aa433a5f7d2df8abcf517f23d3ab62112939" gracePeriod=30 Mar 12 13:31:19 crc kubenswrapper[4921]: 
I0312 13:31:19.886024 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="proxy-httpd" containerID="cri-o://4d3e4b092ff0ead2067d6da89966b4b2429ee0e77707207913218fa5fe7429df" gracePeriod=30 Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.886026 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="sg-core" containerID="cri-o://c5e51c6ee2388bf0d86fd13c76221a87f2e49b96e0cf1f4685eaf2a7f4f24b91" gracePeriod=30 Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.886050 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-notification-agent" containerID="cri-o://0908c15a770eac4ced285413c97d4bb1e1080776174bd817119a096fe318eb9e" gracePeriod=30 Mar 12 13:31:19 crc kubenswrapper[4921]: I0312 13:31:19.910264 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.210966 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.707617 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" event={"ID":"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c","Type":"ContainerStarted","Data":"e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5"} Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.708766 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711133 
4921 generic.go:334] "Generic (PLEG): container finished" podID="44836ae0-9135-463a-8694-19de955d2e66" containerID="4d3e4b092ff0ead2067d6da89966b4b2429ee0e77707207913218fa5fe7429df" exitCode=0 Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711152 4921 generic.go:334] "Generic (PLEG): container finished" podID="44836ae0-9135-463a-8694-19de955d2e66" containerID="c5e51c6ee2388bf0d86fd13c76221a87f2e49b96e0cf1f4685eaf2a7f4f24b91" exitCode=2 Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711160 4921 generic.go:334] "Generic (PLEG): container finished" podID="44836ae0-9135-463a-8694-19de955d2e66" containerID="0908c15a770eac4ced285413c97d4bb1e1080776174bd817119a096fe318eb9e" exitCode=0 Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711169 4921 generic.go:334] "Generic (PLEG): container finished" podID="44836ae0-9135-463a-8694-19de955d2e66" containerID="b308f05090e56d2ea4ecebd771f4aa433a5f7d2df8abcf517f23d3ab62112939" exitCode=0 Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711302 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-log" containerID="cri-o://f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb" gracePeriod=30 Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711486 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerDied","Data":"4d3e4b092ff0ead2067d6da89966b4b2429ee0e77707207913218fa5fe7429df"} Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711505 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerDied","Data":"c5e51c6ee2388bf0d86fd13c76221a87f2e49b96e0cf1f4685eaf2a7f4f24b91"} Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711514 4921 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerDied","Data":"0908c15a770eac4ced285413c97d4bb1e1080776174bd817119a096fe318eb9e"} Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711524 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerDied","Data":"b308f05090e56d2ea4ecebd771f4aa433a5f7d2df8abcf517f23d3ab62112939"} Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.711570 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-api" containerID="cri-o://1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41" gracePeriod=30 Mar 12 13:31:20 crc kubenswrapper[4921]: I0312 13:31:20.957452 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:20.979582 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" podStartSLOduration=3.979567862 podStartE2EDuration="3.979567862s" podCreationTimestamp="2026-03-12 13:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:20.738642818 +0000 UTC m=+1303.428714779" watchObservedRunningTime="2026-03-12 13:31:20.979567862 +0000 UTC m=+1303.669639833" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.089615 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146669 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-scripts\") pod 
\"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146715 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5hr4\" (UniqueName: \"kubernetes.io/projected/44836ae0-9135-463a-8694-19de955d2e66-kube-api-access-q5hr4\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146777 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-ceilometer-tls-certs\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146799 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-combined-ca-bundle\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146831 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-run-httpd\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146932 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-config-data\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.146954 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-sg-core-conf-yaml\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.147022 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-log-httpd\") pod \"44836ae0-9135-463a-8694-19de955d2e66\" (UID: \"44836ae0-9135-463a-8694-19de955d2e66\") " Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.148037 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.150988 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.154319 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-scripts" (OuterVolumeSpecName: "scripts") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.155524 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44836ae0-9135-463a-8694-19de955d2e66-kube-api-access-q5hr4" (OuterVolumeSpecName: "kube-api-access-q5hr4") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "kube-api-access-q5hr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.183919 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.214385 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.243420 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.249648 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.249906 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.249998 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.250129 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5hr4\" (UniqueName: \"kubernetes.io/projected/44836ae0-9135-463a-8694-19de955d2e66-kube-api-access-q5hr4\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.250236 4921 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.250328 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.250413 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44836ae0-9135-463a-8694-19de955d2e66-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.272579 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-config-data" (OuterVolumeSpecName: "config-data") pod "44836ae0-9135-463a-8694-19de955d2e66" (UID: "44836ae0-9135-463a-8694-19de955d2e66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.352280 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44836ae0-9135-463a-8694-19de955d2e66-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.719475 4921 generic.go:334] "Generic (PLEG): container finished" podID="3302ef33-557d-4934-9dab-57dcbc94d090" containerID="f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb" exitCode=143 Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.719527 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3302ef33-557d-4934-9dab-57dcbc94d090","Type":"ContainerDied","Data":"f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb"} Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.722663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"44836ae0-9135-463a-8694-19de955d2e66","Type":"ContainerDied","Data":"a369615aa55d288ae1562f04a68dc61bc0b939f54706c574804abcd64da5564a"} Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.722705 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.722731 4921 scope.go:117] "RemoveContainer" containerID="4d3e4b092ff0ead2067d6da89966b4b2429ee0e77707207913218fa5fe7429df" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.767801 4921 scope.go:117] "RemoveContainer" containerID="c5e51c6ee2388bf0d86fd13c76221a87f2e49b96e0cf1f4685eaf2a7f4f24b91" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.773179 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.782677 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.795924 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:21 crc kubenswrapper[4921]: E0312 13:31:21.796264 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="proxy-httpd" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796279 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="proxy-httpd" Mar 12 13:31:21 crc kubenswrapper[4921]: E0312 13:31:21.796302 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="sg-core" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796309 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="sg-core" Mar 12 13:31:21 crc kubenswrapper[4921]: E0312 13:31:21.796326 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-notification-agent" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796333 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-notification-agent" Mar 12 13:31:21 crc kubenswrapper[4921]: E0312 13:31:21.796346 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-central-agent" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796352 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-central-agent" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796502 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-notification-agent" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796515 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="sg-core" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796534 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="ceilometer-central-agent" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.796546 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="44836ae0-9135-463a-8694-19de955d2e66" containerName="proxy-httpd" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.797928 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.798699 4921 scope.go:117] "RemoveContainer" containerID="0908c15a770eac4ced285413c97d4bb1e1080776174bd817119a096fe318eb9e" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.799661 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.800463 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.800516 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.818416 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.843983 4921 scope.go:117] "RemoveContainer" containerID="b308f05090e56d2ea4ecebd771f4aa433a5f7d2df8abcf517f23d3ab62112939" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.964017 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-scripts\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.964205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.964335 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-log-httpd\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.964472 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.964646 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-config-data\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.964945 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bff92\" (UniqueName: \"kubernetes.io/projected/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-kube-api-access-bff92\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.965021 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:21 crc kubenswrapper[4921]: I0312 13:31:21.965120 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.004806 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44836ae0-9135-463a-8694-19de955d2e66" path="/var/lib/kubelet/pods/44836ae0-9135-463a-8694-19de955d2e66/volumes" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.005798 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:22 crc kubenswrapper[4921]: E0312 13:31:22.007192 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-bff92 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.068892 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.069531 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-config-data\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.069846 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bff92\" (UniqueName: \"kubernetes.io/projected/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-kube-api-access-bff92\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc 
kubenswrapper[4921]: I0312 13:31:22.069948 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.070041 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-run-httpd\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.070148 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-scripts\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.070296 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.070433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-log-httpd\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.071983 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-log-httpd\") pod \"ceilometer-0\" 
(UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.072221 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-run-httpd\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.075303 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.076250 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-config-data\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.077842 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-scripts\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.086981 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.098789 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.098929 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bff92\" (UniqueName: \"kubernetes.io/projected/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-kube-api-access-bff92\") pod \"ceilometer-0\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.748588 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.767712 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884035 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bff92\" (UniqueName: \"kubernetes.io/projected/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-kube-api-access-bff92\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884077 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-sg-core-conf-yaml\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884111 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-scripts\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 
13:31:22.884134 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-config-data\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884186 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-ceilometer-tls-certs\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884245 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-run-httpd\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884393 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-combined-ca-bundle\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884465 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-log-httpd\") pod \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\" (UID: \"b7ebc747-d7e9-4b5b-843b-a6df55a03c5a\") " Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884886 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" 
(UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.884915 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.890249 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.890626 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-scripts" (OuterVolumeSpecName: "scripts") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.891392 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.891852 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.891897 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-config-data" (OuterVolumeSpecName: "config-data") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.893077 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-kube-api-access-bff92" (OuterVolumeSpecName: "kube-api-access-bff92") pod "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" (UID: "b7ebc747-d7e9-4b5b-843b-a6df55a03c5a"). InnerVolumeSpecName "kube-api-access-bff92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986139 4921 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986171 4921 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986180 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986190 4921 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986199 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bff92\" (UniqueName: \"kubernetes.io/projected/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-kube-api-access-bff92\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986209 4921 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986217 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:22 crc kubenswrapper[4921]: I0312 13:31:22.986226 4921 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.754908 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.829336 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.839054 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.852765 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.856596 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.858986 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.859130 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.859545 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.862721 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902344 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q252h\" (UniqueName: \"kubernetes.io/projected/f195685b-74f0-4887-8598-367bf4425faa-kube-api-access-q252h\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " 
pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902400 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f195685b-74f0-4887-8598-367bf4425faa-run-httpd\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902477 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902511 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-scripts\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902530 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902587 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902607 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f195685b-74f0-4887-8598-367bf4425faa-log-httpd\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:23 crc kubenswrapper[4921]: I0312 13:31:23.902684 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-config-data\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.000611 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7ebc747-d7e9-4b5b-843b-a6df55a03c5a" path="/var/lib/kubelet/pods/b7ebc747-d7e9-4b5b-843b-a6df55a03c5a/volumes" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.003754 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.003825 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-scripts\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.003844 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 
13:31:24.003916 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.003943 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f195685b-74f0-4887-8598-367bf4425faa-log-httpd\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.003962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-config-data\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.003980 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q252h\" (UniqueName: \"kubernetes.io/projected/f195685b-74f0-4887-8598-367bf4425faa-kube-api-access-q252h\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.004024 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f195685b-74f0-4887-8598-367bf4425faa-run-httpd\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.004406 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f195685b-74f0-4887-8598-367bf4425faa-run-httpd\") pod \"ceilometer-0\" (UID: 
\"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.005461 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f195685b-74f0-4887-8598-367bf4425faa-log-httpd\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.008795 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.010428 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-scripts\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.011620 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.012843 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.013061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f195685b-74f0-4887-8598-367bf4425faa-config-data\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.021938 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q252h\" (UniqueName: \"kubernetes.io/projected/f195685b-74f0-4887-8598-367bf4425faa-kube-api-access-q252h\") pod \"ceilometer-0\" (UID: \"f195685b-74f0-4887-8598-367bf4425faa\") " pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.187535 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.262840 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.311262 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-config-data\") pod \"3302ef33-557d-4934-9dab-57dcbc94d090\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.311388 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-combined-ca-bundle\") pod \"3302ef33-557d-4934-9dab-57dcbc94d090\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.311455 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f786c\" (UniqueName: \"kubernetes.io/projected/3302ef33-557d-4934-9dab-57dcbc94d090-kube-api-access-f786c\") pod \"3302ef33-557d-4934-9dab-57dcbc94d090\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 
13:31:24.311550 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3302ef33-557d-4934-9dab-57dcbc94d090-logs\") pod \"3302ef33-557d-4934-9dab-57dcbc94d090\" (UID: \"3302ef33-557d-4934-9dab-57dcbc94d090\") " Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.313556 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3302ef33-557d-4934-9dab-57dcbc94d090-logs" (OuterVolumeSpecName: "logs") pod "3302ef33-557d-4934-9dab-57dcbc94d090" (UID: "3302ef33-557d-4934-9dab-57dcbc94d090"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.339139 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3302ef33-557d-4934-9dab-57dcbc94d090-kube-api-access-f786c" (OuterVolumeSpecName: "kube-api-access-f786c") pod "3302ef33-557d-4934-9dab-57dcbc94d090" (UID: "3302ef33-557d-4934-9dab-57dcbc94d090"). InnerVolumeSpecName "kube-api-access-f786c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.350877 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3302ef33-557d-4934-9dab-57dcbc94d090" (UID: "3302ef33-557d-4934-9dab-57dcbc94d090"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.354062 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-config-data" (OuterVolumeSpecName: "config-data") pod "3302ef33-557d-4934-9dab-57dcbc94d090" (UID: "3302ef33-557d-4934-9dab-57dcbc94d090"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.414197 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f786c\" (UniqueName: \"kubernetes.io/projected/3302ef33-557d-4934-9dab-57dcbc94d090-kube-api-access-f786c\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.414246 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3302ef33-557d-4934-9dab-57dcbc94d090-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.414255 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.414265 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3302ef33-557d-4934-9dab-57dcbc94d090-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.540424 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.763501 4921 generic.go:334] "Generic (PLEG): container finished" podID="3302ef33-557d-4934-9dab-57dcbc94d090" containerID="1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41" exitCode=0 Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.763570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3302ef33-557d-4934-9dab-57dcbc94d090","Type":"ContainerDied","Data":"1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41"} Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.763597 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3302ef33-557d-4934-9dab-57dcbc94d090","Type":"ContainerDied","Data":"279aa8e19d2a0647b8e6e6835620fd749fb44bd47ff9d8d04661c17e8704f715"} Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.763612 4921 scope.go:117] "RemoveContainer" containerID="1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.763709 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.771217 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f195685b-74f0-4887-8598-367bf4425faa","Type":"ContainerStarted","Data":"44a780426c6e714088326912a9c0f48aa1036d5d374d68ab289721b61e4daf14"} Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.793251 4921 scope.go:117] "RemoveContainer" containerID="f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.794580 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.815064 4921 scope.go:117] "RemoveContainer" containerID="1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41" Mar 12 13:31:24 crc kubenswrapper[4921]: E0312 13:31:24.815569 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41\": container with ID starting with 1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41 not found: ID does not exist" containerID="1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.815616 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41"} 
err="failed to get container status \"1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41\": rpc error: code = NotFound desc = could not find container \"1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41\": container with ID starting with 1211cbc35862694895c201473af095d18af65a659488bc8390f5330c26215a41 not found: ID does not exist" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.815648 4921 scope.go:117] "RemoveContainer" containerID="f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb" Mar 12 13:31:24 crc kubenswrapper[4921]: E0312 13:31:24.816325 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb\": container with ID starting with f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb not found: ID does not exist" containerID="f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.816358 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb"} err="failed to get container status \"f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb\": rpc error: code = NotFound desc = could not find container \"f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb\": container with ID starting with f4c133ae0816fdf8b088f36ffe494a374bd98719ec665c76f4196ddead7e50cb not found: ID does not exist" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.818295 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.831420 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:24 crc kubenswrapper[4921]: E0312 13:31:24.831895 4921 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-log" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.831917 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-log" Mar 12 13:31:24 crc kubenswrapper[4921]: E0312 13:31:24.831939 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-api" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.831946 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-api" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.832143 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-api" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.832180 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" containerName="nova-api-log" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.833270 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.872568 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.873168 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.875485 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.882199 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.924482 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7x8\" (UniqueName: \"kubernetes.io/projected/36b01c9f-9666-4fac-9a0e-77e23c79a126-kube-api-access-nj7x8\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.924580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b01c9f-9666-4fac-9a0e-77e23c79a126-logs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.924731 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-internal-tls-certs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.924951 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-config-data\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.925119 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:24 crc kubenswrapper[4921]: I0312 13:31:24.925194 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-public-tls-certs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.027938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b01c9f-9666-4fac-9a0e-77e23c79a126-logs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.028049 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-internal-tls-certs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.028094 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-config-data\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc 
kubenswrapper[4921]: I0312 13:31:25.028176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.028206 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-public-tls-certs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.028290 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7x8\" (UniqueName: \"kubernetes.io/projected/36b01c9f-9666-4fac-9a0e-77e23c79a126-kube-api-access-nj7x8\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.028575 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b01c9f-9666-4fac-9a0e-77e23c79a126-logs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.031775 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-public-tls-certs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.032713 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.036205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-internal-tls-certs\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.037667 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-config-data\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.047797 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7x8\" (UniqueName: \"kubernetes.io/projected/36b01c9f-9666-4fac-9a0e-77e23c79a126-kube-api-access-nj7x8\") pod \"nova-api-0\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.183351 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.664486 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.787450 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f195685b-74f0-4887-8598-367bf4425faa","Type":"ContainerStarted","Data":"44d0c766576b5224a77b6557a10743221c1386b7af6e6e0666e46b939fc4aa9e"} Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.788648 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36b01c9f-9666-4fac-9a0e-77e23c79a126","Type":"ContainerStarted","Data":"59494642c0ea56abe6f1f8364f0576b213163bc67bd6817a1c1a24a22a7266a8"} Mar 12 13:31:25 crc kubenswrapper[4921]: I0312 13:31:25.994735 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3302ef33-557d-4934-9dab-57dcbc94d090" path="/var/lib/kubelet/pods/3302ef33-557d-4934-9dab-57dcbc94d090/volumes" Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.069036 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.069085 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.088991 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.110281 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.800990 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f195685b-74f0-4887-8598-367bf4425faa","Type":"ContainerStarted","Data":"a56c66bc9407fccbb538c98ff332a54e0cb3eeb0cbd5e749b136a12bc827955f"} Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.801030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f195685b-74f0-4887-8598-367bf4425faa","Type":"ContainerStarted","Data":"40c1bd4513071f462f91bd4d33771e4be03f55bf4cfc8a3d20168e7c9c9e70e4"} Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.803732 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36b01c9f-9666-4fac-9a0e-77e23c79a126","Type":"ContainerStarted","Data":"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0"} Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.803794 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36b01c9f-9666-4fac-9a0e-77e23c79a126","Type":"ContainerStarted","Data":"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe"} Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.820359 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 12 13:31:26 crc kubenswrapper[4921]: I0312 13:31:26.826427 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.826411336 podStartE2EDuration="2.826411336s" podCreationTimestamp="2026-03-12 13:31:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:26.825227571 +0000 UTC m=+1309.515299552" watchObservedRunningTime="2026-03-12 13:31:26.826411336 +0000 UTC m=+1309.516483307" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.078446 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hm292"] Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.079695 
4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.081777 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.081890 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.081981 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.081976 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.090591 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hm292"] Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.178919 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-scripts\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.179011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-config-data\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.179172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.179262 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfqk\" (UniqueName: \"kubernetes.io/projected/00afedfb-6f74-48a9-92cc-4b7d6ac94161-kube-api-access-9bfqk\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.280843 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.280919 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bfqk\" (UniqueName: \"kubernetes.io/projected/00afedfb-6f74-48a9-92cc-4b7d6ac94161-kube-api-access-9bfqk\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.280995 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-scripts\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.281051 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-config-data\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.285186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.285828 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-scripts\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.293503 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-config-data\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.307322 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bfqk\" (UniqueName: 
\"kubernetes.io/projected/00afedfb-6f74-48a9-92cc-4b7d6ac94161-kube-api-access-9bfqk\") pod \"nova-cell1-cell-mapping-hm292\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.402889 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:27 crc kubenswrapper[4921]: W0312 13:31:27.834733 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00afedfb_6f74_48a9_92cc_4b7d6ac94161.slice/crio-4a81a7f5f98b5eca83757dc587b9d0ce4c2f650e99c688df772e1bbd75b7d252 WatchSource:0}: Error finding container 4a81a7f5f98b5eca83757dc587b9d0ce4c2f650e99c688df772e1bbd75b7d252: Status 404 returned error can't find the container with id 4a81a7f5f98b5eca83757dc587b9d0ce4c2f650e99c688df772e1bbd75b7d252 Mar 12 13:31:27 crc kubenswrapper[4921]: I0312 13:31:27.836759 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hm292"] Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.207974 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.293022 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-46l8p"] Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.293264 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerName="dnsmasq-dns" containerID="cri-o://f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a" gracePeriod=10 Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.762975 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.824485 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hm292" event={"ID":"00afedfb-6f74-48a9-92cc-4b7d6ac94161","Type":"ContainerStarted","Data":"93cea6bebede9d4972c1fa8eca3402fa265a189c91c85e9ccbe28acf6dc9487c"} Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.824830 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hm292" event={"ID":"00afedfb-6f74-48a9-92cc-4b7d6ac94161","Type":"ContainerStarted","Data":"4a81a7f5f98b5eca83757dc587b9d0ce4c2f650e99c688df772e1bbd75b7d252"} Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.830606 4921 generic.go:334] "Generic (PLEG): container finished" podID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerID="f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a" exitCode=0 Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.830637 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" event={"ID":"cc91c058-9ddc-41e2-b22d-0c83a87afbd7","Type":"ContainerDied","Data":"f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a"} Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.830657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" event={"ID":"cc91c058-9ddc-41e2-b22d-0c83a87afbd7","Type":"ContainerDied","Data":"28812fb312b4ab863d3d1011c6826ca6e4d8020c603880690f43120927c149a8"} Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.830673 4921 scope.go:117] "RemoveContainer" containerID="f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.830796 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-46l8p" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.847110 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hm292" podStartSLOduration=1.8470902489999999 podStartE2EDuration="1.847090249s" podCreationTimestamp="2026-03-12 13:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:28.839291253 +0000 UTC m=+1311.529363224" watchObservedRunningTime="2026-03-12 13:31:28.847090249 +0000 UTC m=+1311.537162220" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.855358 4921 scope.go:117] "RemoveContainer" containerID="68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.883951 4921 scope.go:117] "RemoveContainer" containerID="f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a" Mar 12 13:31:28 crc kubenswrapper[4921]: E0312 13:31:28.887904 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a\": container with ID starting with f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a not found: ID does not exist" containerID="f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.887940 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a"} err="failed to get container status \"f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a\": rpc error: code = NotFound desc = could not find container \"f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a\": container with ID starting with 
f0fc8526058df2f75a2124c282888f548632652aadbf67b6c5ac4f03d4b8701a not found: ID does not exist" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.887960 4921 scope.go:117] "RemoveContainer" containerID="68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1" Mar 12 13:31:28 crc kubenswrapper[4921]: E0312 13:31:28.890002 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1\": container with ID starting with 68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1 not found: ID does not exist" containerID="68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.890046 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1"} err="failed to get container status \"68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1\": rpc error: code = NotFound desc = could not find container \"68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1\": container with ID starting with 68a5f0b7caf542de11475d4ba6d07ddc238595f1dbe4db0f2c92cd56ec3162b1 not found: ID does not exist" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.908534 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-sb\") pod \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.908616 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-nb\") pod \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\" (UID: 
\"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.908658 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-dns-svc\") pod \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.908768 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckcsk\" (UniqueName: \"kubernetes.io/projected/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-kube-api-access-ckcsk\") pod \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.908788 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-config\") pod \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\" (UID: \"cc91c058-9ddc-41e2-b22d-0c83a87afbd7\") " Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.920956 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-kube-api-access-ckcsk" (OuterVolumeSpecName: "kube-api-access-ckcsk") pod "cc91c058-9ddc-41e2-b22d-0c83a87afbd7" (UID: "cc91c058-9ddc-41e2-b22d-0c83a87afbd7"). InnerVolumeSpecName "kube-api-access-ckcsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.955049 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc91c058-9ddc-41e2-b22d-0c83a87afbd7" (UID: "cc91c058-9ddc-41e2-b22d-0c83a87afbd7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.958268 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-config" (OuterVolumeSpecName: "config") pod "cc91c058-9ddc-41e2-b22d-0c83a87afbd7" (UID: "cc91c058-9ddc-41e2-b22d-0c83a87afbd7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.971467 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc91c058-9ddc-41e2-b22d-0c83a87afbd7" (UID: "cc91c058-9ddc-41e2-b22d-0c83a87afbd7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:31:28 crc kubenswrapper[4921]: I0312 13:31:28.983323 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc91c058-9ddc-41e2-b22d-0c83a87afbd7" (UID: "cc91c058-9ddc-41e2-b22d-0c83a87afbd7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.010866 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckcsk\" (UniqueName: \"kubernetes.io/projected/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-kube-api-access-ckcsk\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.010893 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.010903 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.010911 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.010919 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc91c058-9ddc-41e2-b22d-0c83a87afbd7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.167117 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-46l8p"] Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.175537 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-46l8p"] Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.843992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f195685b-74f0-4887-8598-367bf4425faa","Type":"ContainerStarted","Data":"d1ee1363559f923863f4e928aa45c2d7e60dc15eba3b4cb9ad1100eb13cec2b4"} Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.867391 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8240025490000003 podStartE2EDuration="6.867369477s" podCreationTimestamp="2026-03-12 13:31:23 +0000 UTC" firstStartedPulling="2026-03-12 13:31:24.550147855 +0000 UTC m=+1307.240219826" lastFinishedPulling="2026-03-12 13:31:28.593514783 +0000 UTC m=+1311.283586754" observedRunningTime="2026-03-12 13:31:29.866238552 +0000 UTC m=+1312.556310523" watchObservedRunningTime="2026-03-12 13:31:29.867369477 +0000 UTC m=+1312.557441458" Mar 12 13:31:29 crc kubenswrapper[4921]: I0312 13:31:29.997646 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" path="/var/lib/kubelet/pods/cc91c058-9ddc-41e2-b22d-0c83a87afbd7/volumes" Mar 12 13:31:30 crc kubenswrapper[4921]: I0312 13:31:30.853070 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 12 13:31:32 crc kubenswrapper[4921]: I0312 13:31:32.875805 4921 generic.go:334] "Generic (PLEG): container finished" podID="00afedfb-6f74-48a9-92cc-4b7d6ac94161" containerID="93cea6bebede9d4972c1fa8eca3402fa265a189c91c85e9ccbe28acf6dc9487c" exitCode=0 Mar 12 13:31:32 crc kubenswrapper[4921]: I0312 13:31:32.876209 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hm292" event={"ID":"00afedfb-6f74-48a9-92cc-4b7d6ac94161","Type":"ContainerDied","Data":"93cea6bebede9d4972c1fa8eca3402fa265a189c91c85e9ccbe28acf6dc9487c"} Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.069506 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.069902 4921 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.303345 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.438852 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-combined-ca-bundle\") pod \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.438956 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-config-data\") pod \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.438984 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bfqk\" (UniqueName: \"kubernetes.io/projected/00afedfb-6f74-48a9-92cc-4b7d6ac94161-kube-api-access-9bfqk\") pod \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.439006 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-scripts\") pod \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\" (UID: \"00afedfb-6f74-48a9-92cc-4b7d6ac94161\") " Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.444917 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-scripts" (OuterVolumeSpecName: "scripts") pod "00afedfb-6f74-48a9-92cc-4b7d6ac94161" (UID: 
"00afedfb-6f74-48a9-92cc-4b7d6ac94161"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.445671 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00afedfb-6f74-48a9-92cc-4b7d6ac94161-kube-api-access-9bfqk" (OuterVolumeSpecName: "kube-api-access-9bfqk") pod "00afedfb-6f74-48a9-92cc-4b7d6ac94161" (UID: "00afedfb-6f74-48a9-92cc-4b7d6ac94161"). InnerVolumeSpecName "kube-api-access-9bfqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.481092 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00afedfb-6f74-48a9-92cc-4b7d6ac94161" (UID: "00afedfb-6f74-48a9-92cc-4b7d6ac94161"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.481786 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-config-data" (OuterVolumeSpecName: "config-data") pod "00afedfb-6f74-48a9-92cc-4b7d6ac94161" (UID: "00afedfb-6f74-48a9-92cc-4b7d6ac94161"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.541195 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.541226 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bfqk\" (UniqueName: \"kubernetes.io/projected/00afedfb-6f74-48a9-92cc-4b7d6ac94161-kube-api-access-9bfqk\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.541238 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.541246 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00afedfb-6f74-48a9-92cc-4b7d6ac94161-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.898847 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hm292" event={"ID":"00afedfb-6f74-48a9-92cc-4b7d6ac94161","Type":"ContainerDied","Data":"4a81a7f5f98b5eca83757dc587b9d0ce4c2f650e99c688df772e1bbd75b7d252"} Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.899218 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a81a7f5f98b5eca83757dc587b9d0ce4c2f650e99c688df772e1bbd75b7d252" Mar 12 13:31:34 crc kubenswrapper[4921]: I0312 13:31:34.898940 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hm292" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.123528 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.123915 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-log" containerID="cri-o://7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe" gracePeriod=30 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.123981 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-api" containerID="cri-o://7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0" gracePeriod=30 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.170146 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.170473 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" containerName="nova-scheduler-scheduler" containerID="cri-o://38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" gracePeriod=30 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.187399 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.187622 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-log" containerID="cri-o://3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e" gracePeriod=30 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.187727 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-metadata" containerID="cri-o://7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded" gracePeriod=30 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.668140 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.677313 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-internal-tls-certs\") pod \"36b01c9f-9666-4fac-9a0e-77e23c79a126\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.677372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-config-data\") pod \"36b01c9f-9666-4fac-9a0e-77e23c79a126\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.677401 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b01c9f-9666-4fac-9a0e-77e23c79a126-logs\") pod \"36b01c9f-9666-4fac-9a0e-77e23c79a126\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.677483 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-public-tls-certs\") pod \"36b01c9f-9666-4fac-9a0e-77e23c79a126\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.677512 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nj7x8\" (UniqueName: \"kubernetes.io/projected/36b01c9f-9666-4fac-9a0e-77e23c79a126-kube-api-access-nj7x8\") pod \"36b01c9f-9666-4fac-9a0e-77e23c79a126\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.677539 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-combined-ca-bundle\") pod \"36b01c9f-9666-4fac-9a0e-77e23c79a126\" (UID: \"36b01c9f-9666-4fac-9a0e-77e23c79a126\") " Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.678226 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36b01c9f-9666-4fac-9a0e-77e23c79a126-logs" (OuterVolumeSpecName: "logs") pod "36b01c9f-9666-4fac-9a0e-77e23c79a126" (UID: "36b01c9f-9666-4fac-9a0e-77e23c79a126"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.682571 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b01c9f-9666-4fac-9a0e-77e23c79a126-kube-api-access-nj7x8" (OuterVolumeSpecName: "kube-api-access-nj7x8") pod "36b01c9f-9666-4fac-9a0e-77e23c79a126" (UID: "36b01c9f-9666-4fac-9a0e-77e23c79a126"). InnerVolumeSpecName "kube-api-access-nj7x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.712171 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36b01c9f-9666-4fac-9a0e-77e23c79a126" (UID: "36b01c9f-9666-4fac-9a0e-77e23c79a126"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.712538 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-config-data" (OuterVolumeSpecName: "config-data") pod "36b01c9f-9666-4fac-9a0e-77e23c79a126" (UID: "36b01c9f-9666-4fac-9a0e-77e23c79a126"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.733228 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "36b01c9f-9666-4fac-9a0e-77e23c79a126" (UID: "36b01c9f-9666-4fac-9a0e-77e23c79a126"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.739673 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "36b01c9f-9666-4fac-9a0e-77e23c79a126" (UID: "36b01c9f-9666-4fac-9a0e-77e23c79a126"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.786789 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.786839 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.786849 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36b01c9f-9666-4fac-9a0e-77e23c79a126-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.786864 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.786884 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj7x8\" (UniqueName: \"kubernetes.io/projected/36b01c9f-9666-4fac-9a0e-77e23c79a126-kube-api-access-nj7x8\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.786899 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b01c9f-9666-4fac-9a0e-77e23c79a126-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916538 4921 generic.go:334] "Generic (PLEG): container finished" podID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerID="7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0" exitCode=0 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916581 4921 generic.go:334] 
"Generic (PLEG): container finished" podID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerID="7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe" exitCode=143 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916593 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36b01c9f-9666-4fac-9a0e-77e23c79a126","Type":"ContainerDied","Data":"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0"} Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916646 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36b01c9f-9666-4fac-9a0e-77e23c79a126","Type":"ContainerDied","Data":"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe"} Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"36b01c9f-9666-4fac-9a0e-77e23c79a126","Type":"ContainerDied","Data":"59494642c0ea56abe6f1f8364f0576b213163bc67bd6817a1c1a24a22a7266a8"} Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916679 4921 scope.go:117] "RemoveContainer" containerID="7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.916765 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.922422 4921 generic.go:334] "Generic (PLEG): container finished" podID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerID="3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e" exitCode=143 Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.922475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615","Type":"ContainerDied","Data":"3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e"} Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.941163 4921 scope.go:117] "RemoveContainer" containerID="7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.970409 4921 scope.go:117] "RemoveContainer" containerID="7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0" Mar 12 13:31:35 crc kubenswrapper[4921]: E0312 13:31:35.970842 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0\": container with ID starting with 7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0 not found: ID does not exist" containerID="7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.970877 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0"} err="failed to get container status \"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0\": rpc error: code = NotFound desc = could not find container \"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0\": container with ID starting with 7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0 not 
found: ID does not exist" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.970899 4921 scope.go:117] "RemoveContainer" containerID="7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe" Mar 12 13:31:35 crc kubenswrapper[4921]: E0312 13:31:35.971222 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe\": container with ID starting with 7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe not found: ID does not exist" containerID="7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.971278 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe"} err="failed to get container status \"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe\": rpc error: code = NotFound desc = could not find container \"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe\": container with ID starting with 7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe not found: ID does not exist" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.971314 4921 scope.go:117] "RemoveContainer" containerID="7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.971696 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0"} err="failed to get container status \"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0\": rpc error: code = NotFound desc = could not find container \"7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0\": container with ID starting with 
7ed474b6e6d6f2809919e373b48846ef8648b46c1e5aa7d63bcbc707f71fcde0 not found: ID does not exist" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.971797 4921 scope.go:117] "RemoveContainer" containerID="7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.972115 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe"} err="failed to get container status \"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe\": rpc error: code = NotFound desc = could not find container \"7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe\": container with ID starting with 7ec6dbdd6fab383d81c54ce7a98582dfad25983007c4f00a4cdadec99550ebbe not found: ID does not exist" Mar 12 13:31:35 crc kubenswrapper[4921]: I0312 13:31:35.972184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002438 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002477 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.002770 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerName="dnsmasq-dns" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002784 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerName="dnsmasq-dns" Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.002797 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00afedfb-6f74-48a9-92cc-4b7d6ac94161" containerName="nova-manage" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002805 4921 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="00afedfb-6f74-48a9-92cc-4b7d6ac94161" containerName="nova-manage" Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.002834 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-api" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002840 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-api" Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.002866 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerName="init" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002872 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerName="init" Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.002883 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-log" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.002889 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-log" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.003055 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-log" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.003069 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc91c058-9ddc-41e2-b22d-0c83a87afbd7" containerName="dnsmasq-dns" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.003084 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="00afedfb-6f74-48a9-92cc-4b7d6ac94161" containerName="nova-manage" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.003097 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" containerName="nova-api-api" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.003973 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.007474 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.007916 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.007964 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.008236 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.036708 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.038841 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.040011 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 13:31:36 crc kubenswrapper[4921]: E0312 13:31:36.040086 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" containerName="nova-scheduler-scheduler" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.194608 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.194672 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-public-tls-certs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.194760 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ft6j\" (UniqueName: \"kubernetes.io/projected/148f1f44-e990-4353-b376-1ccbb7f01d0a-kube-api-access-9ft6j\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.194837 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-config-data\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " 
pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.194969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.195067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148f1f44-e990-4353-b376-1ccbb7f01d0a-logs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.297114 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ft6j\" (UniqueName: \"kubernetes.io/projected/148f1f44-e990-4353-b376-1ccbb7f01d0a-kube-api-access-9ft6j\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.297194 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-config-data\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.297256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.297287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/148f1f44-e990-4353-b376-1ccbb7f01d0a-logs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.297410 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.297442 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-public-tls-certs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.298271 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/148f1f44-e990-4353-b376-1ccbb7f01d0a-logs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.301637 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-config-data\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.302461 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.302481 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.303133 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/148f1f44-e990-4353-b376-1ccbb7f01d0a-public-tls-certs\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.330892 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ft6j\" (UniqueName: \"kubernetes.io/projected/148f1f44-e990-4353-b376-1ccbb7f01d0a-kube-api-access-9ft6j\") pod \"nova-api-0\" (UID: \"148f1f44-e990-4353-b376-1ccbb7f01d0a\") " pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.338516 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.790647 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 13:31:36 crc kubenswrapper[4921]: W0312 13:31:36.798017 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148f1f44_e990_4353_b376_1ccbb7f01d0a.slice/crio-3e369cc1318b2364816c09eb230c752a5b469408c046aaa0cc52357bd188f52d WatchSource:0}: Error finding container 3e369cc1318b2364816c09eb230c752a5b469408c046aaa0cc52357bd188f52d: Status 404 returned error can't find the container with id 3e369cc1318b2364816c09eb230c752a5b469408c046aaa0cc52357bd188f52d Mar 12 13:31:36 crc kubenswrapper[4921]: I0312 13:31:36.935879 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"148f1f44-e990-4353-b376-1ccbb7f01d0a","Type":"ContainerStarted","Data":"3e369cc1318b2364816c09eb230c752a5b469408c046aaa0cc52357bd188f52d"} Mar 12 13:31:37 crc kubenswrapper[4921]: I0312 13:31:37.948321 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"148f1f44-e990-4353-b376-1ccbb7f01d0a","Type":"ContainerStarted","Data":"842dc104dac503e2e7f9fbbd7b73c5cc38cb298378710fafcf88906f4a0003d6"} Mar 12 13:31:37 crc kubenswrapper[4921]: I0312 13:31:37.948653 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"148f1f44-e990-4353-b376-1ccbb7f01d0a","Type":"ContainerStarted","Data":"c400f7b8ef16291903465bbd11331b0ca7223c5feb4d7440ffdbf83c30d5afe1"} Mar 12 13:31:37 crc kubenswrapper[4921]: I0312 13:31:37.973369 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9733467129999998 podStartE2EDuration="2.973346713s" podCreationTimestamp="2026-03-12 13:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:37.971609931 +0000 UTC m=+1320.661681942" watchObservedRunningTime="2026-03-12 13:31:37.973346713 +0000 UTC m=+1320.663418694" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.001300 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36b01c9f-9666-4fac-9a0e-77e23c79a126" path="/var/lib/kubelet/pods/36b01c9f-9666-4fac-9a0e-77e23c79a126/volumes" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.858086 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.949624 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-logs" (OuterVolumeSpecName: "logs") pod "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" (UID: "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.950099 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-logs\") pod \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.950867 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-logs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.960585 4921 generic.go:334] "Generic (PLEG): container finished" podID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerID="7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded" exitCode=0 Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.960632 4921 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.960666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615","Type":"ContainerDied","Data":"7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded"} Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.960696 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615","Type":"ContainerDied","Data":"ad3d763924641820cb03fe000299e90fc0d521a94f1c1db1103a7d32b6f72ccb"} Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.960711 4921 scope.go:117] "RemoveContainer" containerID="7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded" Mar 12 13:31:38 crc kubenswrapper[4921]: I0312 13:31:38.987566 4921 scope.go:117] "RemoveContainer" containerID="3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.013526 4921 scope.go:117] "RemoveContainer" containerID="7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded" Mar 12 13:31:39 crc kubenswrapper[4921]: E0312 13:31:39.013860 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded\": container with ID starting with 7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded not found: ID does not exist" containerID="7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.013887 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded"} err="failed to get container status 
\"7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded\": rpc error: code = NotFound desc = could not find container \"7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded\": container with ID starting with 7e029eb710c5a06bd06499db12be33d8fe701748717dffc5d41cce0669d8cded not found: ID does not exist" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.013906 4921 scope.go:117] "RemoveContainer" containerID="3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e" Mar 12 13:31:39 crc kubenswrapper[4921]: E0312 13:31:39.014070 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e\": container with ID starting with 3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e not found: ID does not exist" containerID="3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.014090 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e"} err="failed to get container status \"3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e\": rpc error: code = NotFound desc = could not find container \"3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e\": container with ID starting with 3e6c646a1df84ba0c4bcf4b04c201cafed11515abd93f56bb15e6aacc3e4e47e not found: ID does not exist" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.051617 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-nova-metadata-tls-certs\") pod \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.052238 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-combined-ca-bundle\") pod \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.052311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-config-data\") pod \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.052381 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wkmh\" (UniqueName: \"kubernetes.io/projected/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-kube-api-access-4wkmh\") pod \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\" (UID: \"4cfad37d-60e7-4c8b-ba4e-1fddef1cb615\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.057698 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-kube-api-access-4wkmh" (OuterVolumeSpecName: "kube-api-access-4wkmh") pod "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" (UID: "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615"). InnerVolumeSpecName "kube-api-access-4wkmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.096587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" (UID: "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.097329 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-config-data" (OuterVolumeSpecName: "config-data") pod "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" (UID: "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.131401 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" (UID: "4cfad37d-60e7-4c8b-ba4e-1fddef1cb615"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.156555 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.156608 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.156647 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wkmh\" (UniqueName: \"kubernetes.io/projected/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-kube-api-access-4wkmh\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.156667 4921 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.312919 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.322510 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.342094 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:39 crc kubenswrapper[4921]: E0312 13:31:39.342577 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-log" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.342603 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-log" Mar 12 13:31:39 crc kubenswrapper[4921]: E0312 13:31:39.342641 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-metadata" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.342650 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-metadata" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.342930 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-log" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.342965 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" containerName="nova-metadata-metadata" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.344160 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.349115 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.349449 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.354391 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.361906 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.362082 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-config-data\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.362255 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.362454 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4kvl\" (UniqueName: 
\"kubernetes.io/projected/a8089872-446f-4355-94d8-8b82e1b04030-kube-api-access-v4kvl\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.362629 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8089872-446f-4355-94d8-8b82e1b04030-logs\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.464042 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.464120 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4kvl\" (UniqueName: \"kubernetes.io/projected/a8089872-446f-4355-94d8-8b82e1b04030-kube-api-access-v4kvl\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.464196 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8089872-446f-4355-94d8-8b82e1b04030-logs\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.464238 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " 
pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.464256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-config-data\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.465766 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8089872-446f-4355-94d8-8b82e1b04030-logs\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.468978 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.470053 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-config-data\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.470762 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8089872-446f-4355-94d8-8b82e1b04030-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.485177 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4kvl\" (UniqueName: 
\"kubernetes.io/projected/a8089872-446f-4355-94d8-8b82e1b04030-kube-api-access-v4kvl\") pod \"nova-metadata-0\" (UID: \"a8089872-446f-4355-94d8-8b82e1b04030\") " pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.666076 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.836239 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.872753 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89gpn\" (UniqueName: \"kubernetes.io/projected/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-kube-api-access-89gpn\") pod \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.873037 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-combined-ca-bundle\") pod \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.873081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-config-data\") pod \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\" (UID: \"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9\") " Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.879027 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-kube-api-access-89gpn" (OuterVolumeSpecName: "kube-api-access-89gpn") pod "02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" (UID: "02d0afd0-25f5-44b7-91f0-47d0be7ba8f9"). 
InnerVolumeSpecName "kube-api-access-89gpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.883956 4921 scope.go:117] "RemoveContainer" containerID="03ceaeb590891f2263ea251ed3386a30cd832d3c1faed756de5e3fc7776b5b93" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.904769 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-config-data" (OuterVolumeSpecName: "config-data") pod "02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" (UID: "02d0afd0-25f5-44b7-91f0-47d0be7ba8f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.905562 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" (UID: "02d0afd0-25f5-44b7-91f0-47d0be7ba8f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.917620 4921 scope.go:117] "RemoveContainer" containerID="be8e1aff0328d2e3a4f335b0dfb700efc023d0b314dcfd68c279a226b20d7cad" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.944871 4921 scope.go:117] "RemoveContainer" containerID="0c1d09c13d1538f8de372704d0177bf0c6b360e8fec4e5e7a60e93e9ca2923b4" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.975987 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89gpn\" (UniqueName: \"kubernetes.io/projected/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-kube-api-access-89gpn\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.976201 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.976211 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.978434 4921 generic.go:334] "Generic (PLEG): container finished" podID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" exitCode=0 Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.978483 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.978491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9","Type":"ContainerDied","Data":"38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5"} Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.978562 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"02d0afd0-25f5-44b7-91f0-47d0be7ba8f9","Type":"ContainerDied","Data":"9d901e7663c04a70aff38e920862937467e2f11e17b0debdcb5e219da6d3d37f"} Mar 12 13:31:39 crc kubenswrapper[4921]: I0312 13:31:39.978583 4921 scope.go:117] "RemoveContainer" containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.032752 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cfad37d-60e7-4c8b-ba4e-1fddef1cb615" path="/var/lib/kubelet/pods/4cfad37d-60e7-4c8b-ba4e-1fddef1cb615/volumes" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.033917 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.038664 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.045111 4921 scope.go:117] "RemoveContainer" containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" Mar 12 13:31:40 crc kubenswrapper[4921]: E0312 13:31:40.045555 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5\": container with ID starting with 38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5 not found: ID does not exist" 
containerID="38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.045733 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5"} err="failed to get container status \"38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5\": rpc error: code = NotFound desc = could not find container \"38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5\": container with ID starting with 38298f5852343d33a73d0bfaf911130844965bc177efc2b201854353943152c5 not found: ID does not exist" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.051353 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:31:40 crc kubenswrapper[4921]: E0312 13:31:40.051756 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" containerName="nova-scheduler-scheduler" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.051775 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" containerName="nova-scheduler-scheduler" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.051971 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" containerName="nova-scheduler-scheduler" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.052627 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.057100 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.058276 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.077232 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkst\" (UniqueName: \"kubernetes.io/projected/b3862104-1cf4-4b79-ab48-f94ad1e83964-kube-api-access-bqkst\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.077563 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3862104-1cf4-4b79-ab48-f94ad1e83964-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.077679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3862104-1cf4-4b79-ab48-f94ad1e83964-config-data\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.126044 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 13:31:40 crc kubenswrapper[4921]: W0312 13:31:40.132159 4921 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8089872_446f_4355_94d8_8b82e1b04030.slice/crio-32da8f44898bf0cef01ecfec3354e83f5336f586ee540d7fe7609e611352641c WatchSource:0}: Error finding container 32da8f44898bf0cef01ecfec3354e83f5336f586ee540d7fe7609e611352641c: Status 404 returned error can't find the container with id 32da8f44898bf0cef01ecfec3354e83f5336f586ee540d7fe7609e611352641c Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.179893 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkst\" (UniqueName: \"kubernetes.io/projected/b3862104-1cf4-4b79-ab48-f94ad1e83964-kube-api-access-bqkst\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.179974 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3862104-1cf4-4b79-ab48-f94ad1e83964-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.180019 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3862104-1cf4-4b79-ab48-f94ad1e83964-config-data\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.185534 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3862104-1cf4-4b79-ab48-f94ad1e83964-config-data\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.190568 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3862104-1cf4-4b79-ab48-f94ad1e83964-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.198933 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkst\" (UniqueName: \"kubernetes.io/projected/b3862104-1cf4-4b79-ab48-f94ad1e83964-kube-api-access-bqkst\") pod \"nova-scheduler-0\" (UID: \"b3862104-1cf4-4b79-ab48-f94ad1e83964\") " pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.373366 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.825160 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 13:31:40 crc kubenswrapper[4921]: I0312 13:31:40.999533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3862104-1cf4-4b79-ab48-f94ad1e83964","Type":"ContainerStarted","Data":"198d58771ace2d820b7b250065d0d33512890228009e9d572397573ed3fb7a57"} Mar 12 13:31:41 crc kubenswrapper[4921]: I0312 13:31:41.002367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8089872-446f-4355-94d8-8b82e1b04030","Type":"ContainerStarted","Data":"935978f3568dd6a3a157a704e1be78c1a64857beb5f878d55131ac92ad8d7f9a"} Mar 12 13:31:41 crc kubenswrapper[4921]: I0312 13:31:41.002416 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8089872-446f-4355-94d8-8b82e1b04030","Type":"ContainerStarted","Data":"54a6441e8d778186d2e47cd06de7715abd92fc6fa8e6a35e84661f88fce54efb"} Mar 12 13:31:41 crc kubenswrapper[4921]: I0312 13:31:41.002432 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a8089872-446f-4355-94d8-8b82e1b04030","Type":"ContainerStarted","Data":"32da8f44898bf0cef01ecfec3354e83f5336f586ee540d7fe7609e611352641c"} Mar 12 13:31:41 crc kubenswrapper[4921]: I0312 13:31:41.021124 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.02110395 podStartE2EDuration="2.02110395s" podCreationTimestamp="2026-03-12 13:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:41.017857012 +0000 UTC m=+1323.707929003" watchObservedRunningTime="2026-03-12 13:31:41.02110395 +0000 UTC m=+1323.711175921" Mar 12 13:31:42 crc kubenswrapper[4921]: I0312 13:31:42.002500 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02d0afd0-25f5-44b7-91f0-47d0be7ba8f9" path="/var/lib/kubelet/pods/02d0afd0-25f5-44b7-91f0-47d0be7ba8f9/volumes" Mar 12 13:31:42 crc kubenswrapper[4921]: I0312 13:31:42.024912 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b3862104-1cf4-4b79-ab48-f94ad1e83964","Type":"ContainerStarted","Data":"7da8b73d4726bf5611146cfc27645bde6ece2fc0185f22987b8751b5fba9741b"} Mar 12 13:31:42 crc kubenswrapper[4921]: I0312 13:31:42.054810 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.054784593 podStartE2EDuration="2.054784593s" podCreationTimestamp="2026-03-12 13:31:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:31:42.048243665 +0000 UTC m=+1324.738315676" watchObservedRunningTime="2026-03-12 13:31:42.054784593 +0000 UTC m=+1324.744856604" Mar 12 13:31:45 crc kubenswrapper[4921]: I0312 13:31:45.374122 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 13:31:46 
crc kubenswrapper[4921]: I0312 13:31:46.339584 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 13:31:46 crc kubenswrapper[4921]: I0312 13:31:46.340164 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 13:31:47 crc kubenswrapper[4921]: I0312 13:31:47.355976 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="148f1f44-e990-4353-b376-1ccbb7f01d0a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 13:31:47 crc kubenswrapper[4921]: I0312 13:31:47.356000 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="148f1f44-e990-4353-b376-1ccbb7f01d0a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 13:31:49 crc kubenswrapper[4921]: I0312 13:31:49.667618 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 13:31:49 crc kubenswrapper[4921]: I0312 13:31:49.669902 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 13:31:50 crc kubenswrapper[4921]: I0312 13:31:50.374194 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 12 13:31:50 crc kubenswrapper[4921]: I0312 13:31:50.399141 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 13:31:50 crc kubenswrapper[4921]: I0312 13:31:50.683180 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a8089872-446f-4355-94d8-8b82e1b04030" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.0.196:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 13:31:50 crc kubenswrapper[4921]: I0312 13:31:50.683182 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a8089872-446f-4355-94d8-8b82e1b04030" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 13:31:51 crc kubenswrapper[4921]: I0312 13:31:51.163138 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 13:31:54 crc kubenswrapper[4921]: I0312 13:31:54.222055 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 12 13:31:54 crc kubenswrapper[4921]: I0312 13:31:54.339272 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 13:31:54 crc kubenswrapper[4921]: I0312 13:31:54.339652 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 13:31:56 crc kubenswrapper[4921]: I0312 13:31:56.346971 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 13:31:56 crc kubenswrapper[4921]: I0312 13:31:56.356887 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 13:31:56 crc kubenswrapper[4921]: I0312 13:31:56.362463 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 13:31:57 crc kubenswrapper[4921]: I0312 13:31:57.198669 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 13:31:57 crc kubenswrapper[4921]: I0312 13:31:57.666180 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 13:31:57 crc kubenswrapper[4921]: 
I0312 13:31:57.666241 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 13:31:59 crc kubenswrapper[4921]: I0312 13:31:59.674329 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 13:31:59 crc kubenswrapper[4921]: I0312 13:31:59.680753 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 13:31:59 crc kubenswrapper[4921]: I0312 13:31:59.682681 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.141971 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555372-2hkm4"] Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.143525 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.146351 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.146352 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.146932 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.164430 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555372-2hkm4"] Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.236088 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.293562 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9jkd\" (UniqueName: \"kubernetes.io/projected/bc827adf-e6a8-4249-a452-8af8f3cde429-kube-api-access-c9jkd\") pod \"auto-csr-approver-29555372-2hkm4\" (UID: \"bc827adf-e6a8-4249-a452-8af8f3cde429\") " pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.395612 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9jkd\" (UniqueName: \"kubernetes.io/projected/bc827adf-e6a8-4249-a452-8af8f3cde429-kube-api-access-c9jkd\") pod \"auto-csr-approver-29555372-2hkm4\" (UID: \"bc827adf-e6a8-4249-a452-8af8f3cde429\") " pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.422552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9jkd\" (UniqueName: \"kubernetes.io/projected/bc827adf-e6a8-4249-a452-8af8f3cde429-kube-api-access-c9jkd\") pod \"auto-csr-approver-29555372-2hkm4\" (UID: \"bc827adf-e6a8-4249-a452-8af8f3cde429\") " pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.468546 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:00 crc kubenswrapper[4921]: I0312 13:32:00.935474 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555372-2hkm4"] Mar 12 13:32:00 crc kubenswrapper[4921]: W0312 13:32:00.944434 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc827adf_e6a8_4249_a452_8af8f3cde429.slice/crio-27f2295995f825610696191af74030b165c8d2a5377aa795bc6e084eeac59c71 WatchSource:0}: Error finding container 27f2295995f825610696191af74030b165c8d2a5377aa795bc6e084eeac59c71: Status 404 returned error can't find the container with id 27f2295995f825610696191af74030b165c8d2a5377aa795bc6e084eeac59c71 Mar 12 13:32:01 crc kubenswrapper[4921]: I0312 13:32:01.232690 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" event={"ID":"bc827adf-e6a8-4249-a452-8af8f3cde429","Type":"ContainerStarted","Data":"27f2295995f825610696191af74030b165c8d2a5377aa795bc6e084eeac59c71"} Mar 12 13:32:03 crc kubenswrapper[4921]: I0312 13:32:03.251754 4921 generic.go:334] "Generic (PLEG): container finished" podID="bc827adf-e6a8-4249-a452-8af8f3cde429" containerID="9a257cbc99df1790ab488b536010ac8715d24cfba59c78d993f9bc8b8cb969b3" exitCode=0 Mar 12 13:32:03 crc kubenswrapper[4921]: I0312 13:32:03.252007 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" event={"ID":"bc827adf-e6a8-4249-a452-8af8f3cde429","Type":"ContainerDied","Data":"9a257cbc99df1790ab488b536010ac8715d24cfba59c78d993f9bc8b8cb969b3"} Mar 12 13:32:04 crc kubenswrapper[4921]: I0312 13:32:04.640254 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:04 crc kubenswrapper[4921]: I0312 13:32:04.777614 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9jkd\" (UniqueName: \"kubernetes.io/projected/bc827adf-e6a8-4249-a452-8af8f3cde429-kube-api-access-c9jkd\") pod \"bc827adf-e6a8-4249-a452-8af8f3cde429\" (UID: \"bc827adf-e6a8-4249-a452-8af8f3cde429\") " Mar 12 13:32:04 crc kubenswrapper[4921]: I0312 13:32:04.789035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc827adf-e6a8-4249-a452-8af8f3cde429-kube-api-access-c9jkd" (OuterVolumeSpecName: "kube-api-access-c9jkd") pod "bc827adf-e6a8-4249-a452-8af8f3cde429" (UID: "bc827adf-e6a8-4249-a452-8af8f3cde429"). InnerVolumeSpecName "kube-api-access-c9jkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:04 crc kubenswrapper[4921]: I0312 13:32:04.879732 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9jkd\" (UniqueName: \"kubernetes.io/projected/bc827adf-e6a8-4249-a452-8af8f3cde429-kube-api-access-c9jkd\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:05 crc kubenswrapper[4921]: I0312 13:32:05.272883 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" event={"ID":"bc827adf-e6a8-4249-a452-8af8f3cde429","Type":"ContainerDied","Data":"27f2295995f825610696191af74030b165c8d2a5377aa795bc6e084eeac59c71"} Mar 12 13:32:05 crc kubenswrapper[4921]: I0312 13:32:05.272920 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f2295995f825610696191af74030b165c8d2a5377aa795bc6e084eeac59c71" Mar 12 13:32:05 crc kubenswrapper[4921]: I0312 13:32:05.272983 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555372-2hkm4" Mar 12 13:32:05 crc kubenswrapper[4921]: I0312 13:32:05.708607 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555366-2kvrw"] Mar 12 13:32:05 crc kubenswrapper[4921]: I0312 13:32:05.717557 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555366-2kvrw"] Mar 12 13:32:06 crc kubenswrapper[4921]: I0312 13:32:06.003751 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="230fb418-c791-493a-9703-188ba4af8657" path="/var/lib/kubelet/pods/230fb418-c791-493a-9703-188ba4af8657/volumes" Mar 12 13:32:08 crc kubenswrapper[4921]: I0312 13:32:08.659711 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:32:10 crc kubenswrapper[4921]: I0312 13:32:10.054953 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:32:12 crc kubenswrapper[4921]: I0312 13:32:12.800401 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="rabbitmq" containerID="cri-o://35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d" gracePeriod=604796 Mar 12 13:32:14 crc kubenswrapper[4921]: I0312 13:32:14.005773 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="rabbitmq" containerID="cri-o://2622f1967762c5f954d8dadd8f0275d5bbe4135976e200c0bc017219c4bc6b92" gracePeriod=604797 Mar 12 13:32:18 crc kubenswrapper[4921]: I0312 13:32:18.370550 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection 
refused" Mar 12 13:32:18 crc kubenswrapper[4921]: I0312 13:32:18.711521 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.365278 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440391 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerID="35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d" exitCode=0 Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440443 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf4146bb-5512-4a8d-81a6-b462a508be2f","Type":"ContainerDied","Data":"35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d"} Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440465 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-plugins-conf\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440486 4921 scope.go:117] "RemoveContainer" containerID="35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440491 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440548 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440588 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-erlang-cookie\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440474 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"bf4146bb-5512-4a8d-81a6-b462a508be2f","Type":"ContainerDied","Data":"62643b0de7f19420512cbfdd05ebbfd924dfd73b565268f9f942cd53ca8d1a75"} Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440630 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-tls\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440842 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-plugins\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440877 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/bf4146bb-5512-4a8d-81a6-b462a508be2f-erlang-cookie-secret\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440948 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-server-conf\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.440993 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf4146bb-5512-4a8d-81a6-b462a508be2f-pod-info\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.441070 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.441096 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbrwz\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-kube-api-access-vbrwz\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.441197 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-confd\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.441315 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-config-data\") pod \"bf4146bb-5512-4a8d-81a6-b462a508be2f\" (UID: \"bf4146bb-5512-4a8d-81a6-b462a508be2f\") " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.441489 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.441609 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.442427 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.442448 4921 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.442508 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.471514 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "persistence") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.472659 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-kube-api-access-vbrwz" (OuterVolumeSpecName: "kube-api-access-vbrwz") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "kube-api-access-vbrwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.473519 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bf4146bb-5512-4a8d-81a6-b462a508be2f-pod-info" (OuterVolumeSpecName: "pod-info") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.473732 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.474394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf4146bb-5512-4a8d-81a6-b462a508be2f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.524100 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-config-data" (OuterVolumeSpecName: "config-data") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.529411 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-server-conf" (OuterVolumeSpecName: "server-conf") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543924 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543954 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543965 4921 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bf4146bb-5512-4a8d-81a6-b462a508be2f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543974 4921 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543982 4921 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bf4146bb-5512-4a8d-81a6-b462a508be2f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543991 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbrwz\" (UniqueName: 
\"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-kube-api-access-vbrwz\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.543998 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bf4146bb-5512-4a8d-81a6-b462a508be2f-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.547346 4921 scope.go:117] "RemoveContainer" containerID="84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.570407 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.577051 4921 scope.go:117] "RemoveContainer" containerID="35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d" Mar 12 13:32:19 crc kubenswrapper[4921]: E0312 13:32:19.577565 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d\": container with ID starting with 35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d not found: ID does not exist" containerID="35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.577603 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d"} err="failed to get container status \"35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d\": rpc error: code = NotFound desc = could not find container \"35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d\": container with ID starting with 35e060ac61a718cdeaeac630edaa120fe7f0f7e9114bab16eb31527e2e1ff99d 
not found: ID does not exist" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.577656 4921 scope.go:117] "RemoveContainer" containerID="84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd" Mar 12 13:32:19 crc kubenswrapper[4921]: E0312 13:32:19.578800 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd\": container with ID starting with 84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd not found: ID does not exist" containerID="84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.578922 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd"} err="failed to get container status \"84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd\": rpc error: code = NotFound desc = could not find container \"84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd\": container with ID starting with 84135f38b17f95a53f553d4468a52434f6006e30f8307b952806cef3b61cebdd not found: ID does not exist" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.606098 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bf4146bb-5512-4a8d-81a6-b462a508be2f" (UID: "bf4146bb-5512-4a8d-81a6-b462a508be2f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.645159 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.645362 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bf4146bb-5512-4a8d-81a6-b462a508be2f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.784067 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.796917 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.809042 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:32:19 crc kubenswrapper[4921]: E0312 13:32:19.809468 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="setup-container" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.809489 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="setup-container" Mar 12 13:32:19 crc kubenswrapper[4921]: E0312 13:32:19.809511 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc827adf-e6a8-4249-a452-8af8f3cde429" containerName="oc" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.809517 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc827adf-e6a8-4249-a452-8af8f3cde429" containerName="oc" Mar 12 13:32:19 crc kubenswrapper[4921]: E0312 13:32:19.809548 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="rabbitmq" Mar 12 
13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.809556 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="rabbitmq" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.809725 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" containerName="rabbitmq" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.809739 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc827adf-e6a8-4249-a452-8af8f3cde429" containerName="oc" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.810771 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.814371 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.814786 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.815004 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4npht" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.815175 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.815440 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.815603 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.815785 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.819712 4921 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.847869 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.847935 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848005 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848052 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2jjz\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-kube-api-access-n2jjz\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848100 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 
12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848271 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848310 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848352 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.848381 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.949382 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.949633 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.949746 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.949842 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.950032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.950772 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.950962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.951074 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.951174 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2jjz\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-kube-api-access-n2jjz\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.951267 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " 
pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.951367 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.950239 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-config-data\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.951172 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.951289 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") device mount path \"/mnt/openstack/pv19\"" pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.952061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.952381 4921 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.952678 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.953386 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.953872 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.955314 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.955672 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " 
pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.967301 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2jjz\" (UniqueName: \"kubernetes.io/projected/7e627c0e-6753-4c4a-ad5f-7d36e4373a2c-kube-api-access-n2jjz\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:19 crc kubenswrapper[4921]: I0312 13:32:19.983026 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"rabbitmq-server-0\" (UID: \"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c\") " pod="openstack/rabbitmq-server-0" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.001948 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4146bb-5512-4a8d-81a6-b462a508be2f" path="/var/lib/kubelet/pods/bf4146bb-5512-4a8d-81a6-b462a508be2f/volumes" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.163223 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.464487 4921 generic.go:334] "Generic (PLEG): container finished" podID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerID="2622f1967762c5f954d8dadd8f0275d5bbe4135976e200c0bc017219c4bc6b92" exitCode=0 Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.464744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c83f4404-c7af-4fb6-aa92-6ac4e691a27f","Type":"ContainerDied","Data":"2622f1967762c5f954d8dadd8f0275d5bbe4135976e200c0bc017219c4bc6b92"} Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.579651 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.688751 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 13:32:20 crc kubenswrapper[4921]: W0312 13:32:20.697948 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e627c0e_6753_4c4a_ad5f_7d36e4373a2c.slice/crio-5be8d33bb28b9555d79e9b4fd54c96eb16e4e28a9499fdf3a78ada63b378e718 WatchSource:0}: Error finding container 5be8d33bb28b9555d79e9b4fd54c96eb16e4e28a9499fdf3a78ada63b378e718: Status 404 returned error can't find the container with id 5be8d33bb28b9555d79e9b4fd54c96eb16e4e28a9499fdf3a78ada63b378e718 Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.766965 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-pod-info\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767010 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-config-data\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767036 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767060 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-server-conf\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767095 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v79p\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-kube-api-access-8v79p\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767199 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-erlang-cookie-secret\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767258 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-plugins\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767292 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-plugins-conf\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767371 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-confd\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc 
kubenswrapper[4921]: I0312 13:32:20.767398 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-tls\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.767426 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-erlang-cookie\") pod \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\" (UID: \"c83f4404-c7af-4fb6-aa92-6ac4e691a27f\") " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.768035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.768336 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.768400 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.773995 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-pod-info" (OuterVolumeSpecName: "pod-info") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.774020 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "persistence") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.774492 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.777850 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-kube-api-access-8v79p" (OuterVolumeSpecName: "kube-api-access-8v79p") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "kube-api-access-8v79p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.780234 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.793195 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-config-data" (OuterVolumeSpecName: "config-data") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.821340 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-server-conf" (OuterVolumeSpecName: "server-conf") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870244 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870709 4921 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870720 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870732 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870744 4921 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-pod-info\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870755 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870787 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870799 4921 
reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-server-conf\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870810 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v79p\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-kube-api-access-8v79p\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.870837 4921 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.884700 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c83f4404-c7af-4fb6-aa92-6ac4e691a27f" (UID: "c83f4404-c7af-4fb6-aa92-6ac4e691a27f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.891173 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.971400 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:20 crc kubenswrapper[4921]: I0312 13:32:20.971438 4921 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c83f4404-c7af-4fb6-aa92-6ac4e691a27f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.483608 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c83f4404-c7af-4fb6-aa92-6ac4e691a27f","Type":"ContainerDied","Data":"447c2eb8ab99e771dedb952f242b8cce5ba9a0f567f4183ce92b7da4ae1fa3e9"} Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.483664 4921 scope.go:117] "RemoveContainer" containerID="2622f1967762c5f954d8dadd8f0275d5bbe4135976e200c0bc017219c4bc6b92" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.483705 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.485884 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c","Type":"ContainerStarted","Data":"5be8d33bb28b9555d79e9b4fd54c96eb16e4e28a9499fdf3a78ada63b378e718"} Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.509341 4921 scope.go:117] "RemoveContainer" containerID="cfab788e9f5ce4b3ba25a82075975900efb79d705a9bb5a1bdfdd4a9183dccb7" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.555356 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.571433 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.596911 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:32:21 crc kubenswrapper[4921]: E0312 13:32:21.597456 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="setup-container" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.597479 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="setup-container" Mar 12 13:32:21 crc kubenswrapper[4921]: E0312 13:32:21.597501 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="rabbitmq" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.597511 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="rabbitmq" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.597770 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" containerName="rabbitmq" 
Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.599183 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.601952 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.602098 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.602299 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.602365 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.602412 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.602473 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5m2pc" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.602637 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.616891 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.784351 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: 
I0312 13:32:21.784595 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.784744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.784858 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.784947 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.785025 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szrgl\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-kube-api-access-szrgl\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc 
kubenswrapper[4921]: I0312 13:32:21.785119 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b28ef2e5-d1ca-460a-9c97-a058c098ef64-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.785208 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.785287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.785381 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.785471 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b28ef2e5-d1ca-460a-9c97-a058c098ef64-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.886948 
4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887233 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887317 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887338 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887361 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-szrgl\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-kube-api-access-szrgl\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887388 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b28ef2e5-d1ca-460a-9c97-a058c098ef64-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887410 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887425 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887443 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887473 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b28ef2e5-d1ca-460a-9c97-a058c098ef64-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.887882 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.888028 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.888140 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.888401 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") device mount path \"/mnt/openstack/pv20\"" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.889761 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.890086 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b28ef2e5-d1ca-460a-9c97-a058c098ef64-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.911133 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b28ef2e5-d1ca-460a-9c97-a058c098ef64-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.911466 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.914643 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.914957 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b28ef2e5-d1ca-460a-9c97-a058c098ef64-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.916357 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-szrgl\" (UniqueName: \"kubernetes.io/projected/b28ef2e5-d1ca-460a-9c97-a058c098ef64-kube-api-access-szrgl\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:21 crc kubenswrapper[4921]: I0312 13:32:21.947006 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b28ef2e5-d1ca-460a-9c97-a058c098ef64\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:22 crc kubenswrapper[4921]: I0312 13:32:22.012878 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83f4404-c7af-4fb6-aa92-6ac4e691a27f" path="/var/lib/kubelet/pods/c83f4404-c7af-4fb6-aa92-6ac4e691a27f/volumes" Mar 12 13:32:22 crc kubenswrapper[4921]: I0312 13:32:22.221484 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:32:22 crc kubenswrapper[4921]: I0312 13:32:22.495497 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c","Type":"ContainerStarted","Data":"0d4ab79e2275a7883f22d504995cd28fd02d21790cea8b78b7bcb4eea4a2ca23"} Mar 12 13:32:22 crc kubenswrapper[4921]: I0312 13:32:22.835931 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 13:32:22 crc kubenswrapper[4921]: W0312 13:32:22.849664 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb28ef2e5_d1ca_460a_9c97_a058c098ef64.slice/crio-6dcd1b483874a96e4c5be68e38b1f64f363c078ae8ad238ff08a7d04b84b0424 WatchSource:0}: Error finding container 6dcd1b483874a96e4c5be68e38b1f64f363c078ae8ad238ff08a7d04b84b0424: Status 404 returned error can't find the container with 
id 6dcd1b483874a96e4c5be68e38b1f64f363c078ae8ad238ff08a7d04b84b0424 Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.516878 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b28ef2e5-d1ca-460a-9c97-a058c098ef64","Type":"ContainerStarted","Data":"6dcd1b483874a96e4c5be68e38b1f64f363c078ae8ad238ff08a7d04b84b0424"} Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.626655 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2v5ng"] Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.632535 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.634875 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.649610 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2v5ng"] Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.721147 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dccsr\" (UniqueName: \"kubernetes.io/projected/dde16069-3b0f-4323-afc5-7b33f07f6bce-kube-api-access-dccsr\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.721238 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.721266 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.721313 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-config\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.721348 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.721386 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.824962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dccsr\" (UniqueName: \"kubernetes.io/projected/dde16069-3b0f-4323-afc5-7b33f07f6bce-kube-api-access-dccsr\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.825347 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.825373 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.826157 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.826242 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-config\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.826288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.826344 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.826517 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.826953 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.827151 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.827284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-config\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.847331 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dccsr\" (UniqueName: 
\"kubernetes.io/projected/dde16069-3b0f-4323-afc5-7b33f07f6bce-kube-api-access-dccsr\") pod \"dnsmasq-dns-6447ccbd8f-2v5ng\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:23 crc kubenswrapper[4921]: I0312 13:32:23.995058 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:24 crc kubenswrapper[4921]: I0312 13:32:24.439133 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2v5ng"] Mar 12 13:32:24 crc kubenswrapper[4921]: W0312 13:32:24.440982 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddde16069_3b0f_4323_afc5_7b33f07f6bce.slice/crio-56962b662872d31d3847dac09683faf3fc6bbf8ac3163bde8ca1f9e7fe375077 WatchSource:0}: Error finding container 56962b662872d31d3847dac09683faf3fc6bbf8ac3163bde8ca1f9e7fe375077: Status 404 returned error can't find the container with id 56962b662872d31d3847dac09683faf3fc6bbf8ac3163bde8ca1f9e7fe375077 Mar 12 13:32:24 crc kubenswrapper[4921]: I0312 13:32:24.529752 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" event={"ID":"dde16069-3b0f-4323-afc5-7b33f07f6bce","Type":"ContainerStarted","Data":"56962b662872d31d3847dac09683faf3fc6bbf8ac3163bde8ca1f9e7fe375077"} Mar 12 13:32:24 crc kubenswrapper[4921]: I0312 13:32:24.532159 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b28ef2e5-d1ca-460a-9c97-a058c098ef64","Type":"ContainerStarted","Data":"3223f3b0d75da5f6541bbb789154fd9a68366dc5226a487010384693e5aae502"} Mar 12 13:32:25 crc kubenswrapper[4921]: I0312 13:32:25.545237 4921 generic.go:334] "Generic (PLEG): container finished" podID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerID="736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696" exitCode=0 Mar 12 
13:32:25 crc kubenswrapper[4921]: I0312 13:32:25.545309 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" event={"ID":"dde16069-3b0f-4323-afc5-7b33f07f6bce","Type":"ContainerDied","Data":"736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696"} Mar 12 13:32:26 crc kubenswrapper[4921]: I0312 13:32:26.323509 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:32:26 crc kubenswrapper[4921]: I0312 13:32:26.323593 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:32:26 crc kubenswrapper[4921]: I0312 13:32:26.560154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" event={"ID":"dde16069-3b0f-4323-afc5-7b33f07f6bce","Type":"ContainerStarted","Data":"1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0"} Mar 12 13:32:26 crc kubenswrapper[4921]: I0312 13:32:26.560515 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:26 crc kubenswrapper[4921]: I0312 13:32:26.591937 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" podStartSLOduration=3.591917311 podStartE2EDuration="3.591917311s" podCreationTimestamp="2026-03-12 13:32:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:32:26.585546938 
+0000 UTC m=+1369.275618939" watchObservedRunningTime="2026-03-12 13:32:26.591917311 +0000 UTC m=+1369.281989282" Mar 12 13:32:33 crc kubenswrapper[4921]: I0312 13:32:33.997007 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.081344 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-z2mnm"] Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.081578 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerName="dnsmasq-dns" containerID="cri-o://e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5" gracePeriod=10 Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.322940 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79794c8ddf-ht28n"] Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.324726 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.333243 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79794c8ddf-ht28n"] Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.449174 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-nb\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.449265 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-openstack-edpm-ipam\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.449311 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qh2q\" (UniqueName: \"kubernetes.io/projected/06610185-0afb-4841-86c4-406c12519fc2-kube-api-access-6qh2q\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.449335 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-dns-svc\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.449424 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-config\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.449600 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-sb\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.551409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-openstack-edpm-ipam\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.551786 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qh2q\" (UniqueName: \"kubernetes.io/projected/06610185-0afb-4841-86c4-406c12519fc2-kube-api-access-6qh2q\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.551829 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-dns-svc\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.551862 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-config\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.551901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-sb\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.551945 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-nb\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.552440 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-openstack-edpm-ipam\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.552688 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-nb\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.553024 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-config\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.553354 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-sb\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.554033 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-dns-svc\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.569327 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qh2q\" (UniqueName: \"kubernetes.io/projected/06610185-0afb-4841-86c4-406c12519fc2-kube-api-access-6qh2q\") pod \"dnsmasq-dns-79794c8ddf-ht28n\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.654242 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.678524 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.689293 4921 generic.go:334] "Generic (PLEG): container finished" podID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerID="e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5" exitCode=0 Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.689339 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" event={"ID":"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c","Type":"ContainerDied","Data":"e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5"} Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.689377 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" event={"ID":"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c","Type":"ContainerDied","Data":"61ab0e1891c999e8d566948a5972283a1334cedaf9b3aa9ed59be386d068f859"} Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.689400 4921 scope.go:117] "RemoveContainer" containerID="e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.689569 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-z2mnm" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.755422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6spx\" (UniqueName: \"kubernetes.io/projected/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-kube-api-access-g6spx\") pod \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.755861 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-nb\") pod \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.755913 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-dns-svc\") pod \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.755973 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-config\") pod \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.756046 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-sb\") pod \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\" (UID: \"fdae03dd-47dd-4e2a-901b-5ec7cc01e91c\") " Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.766066 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-kube-api-access-g6spx" (OuterVolumeSpecName: "kube-api-access-g6spx") pod "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" (UID: "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c"). InnerVolumeSpecName "kube-api-access-g6spx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.787367 4921 scope.go:117] "RemoveContainer" containerID="ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.813301 4921 scope.go:117] "RemoveContainer" containerID="e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5" Mar 12 13:32:34 crc kubenswrapper[4921]: E0312 13:32:34.814845 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5\": container with ID starting with e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5 not found: ID does not exist" containerID="e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.814887 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5"} err="failed to get container status \"e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5\": rpc error: code = NotFound desc = could not find container \"e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5\": container with ID starting with e0f0c0a4d5d8b8661383588f3c914a6367a0e851f0c82e20e11be2b77dc666b5 not found: ID does not exist" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.814914 4921 scope.go:117] "RemoveContainer" containerID="ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424" Mar 12 13:32:34 crc kubenswrapper[4921]: E0312 13:32:34.815294 4921 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424\": container with ID starting with ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424 not found: ID does not exist" containerID="ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.815320 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424"} err="failed to get container status \"ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424\": rpc error: code = NotFound desc = could not find container \"ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424\": container with ID starting with ea7152b7ef32bf12c254a24a00a8805316281683a6a24e244d01ae12594a2424 not found: ID does not exist" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.816033 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" (UID: "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.817628 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" (UID: "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.838385 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-config" (OuterVolumeSpecName: "config") pod "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" (UID: "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.840014 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" (UID: "fdae03dd-47dd-4e2a-901b-5ec7cc01e91c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.858540 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6spx\" (UniqueName: \"kubernetes.io/projected/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-kube-api-access-g6spx\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.858566 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.858575 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:34 crc kubenswrapper[4921]: I0312 13:32:34.858584 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:34 crc 
kubenswrapper[4921]: I0312 13:32:34.858592 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.038682 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-z2mnm"] Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.060190 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-z2mnm"] Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.222677 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79794c8ddf-ht28n"] Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.701378 4921 generic.go:334] "Generic (PLEG): container finished" podID="06610185-0afb-4841-86c4-406c12519fc2" containerID="9a63bb20238de0d1debcb2481355a1728cfc685156ca4f94e46cf92551031f8f" exitCode=0 Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.701456 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" event={"ID":"06610185-0afb-4841-86c4-406c12519fc2","Type":"ContainerDied","Data":"9a63bb20238de0d1debcb2481355a1728cfc685156ca4f94e46cf92551031f8f"} Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.702153 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" event={"ID":"06610185-0afb-4841-86c4-406c12519fc2","Type":"ContainerStarted","Data":"63cfcfb67afeec87b8e3f0abbbe86a3e6978915651b089f246c5f437b2318e84"} Mar 12 13:32:35 crc kubenswrapper[4921]: I0312 13:32:35.997851 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" path="/var/lib/kubelet/pods/fdae03dd-47dd-4e2a-901b-5ec7cc01e91c/volumes" Mar 12 13:32:36 crc kubenswrapper[4921]: I0312 13:32:36.715922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" event={"ID":"06610185-0afb-4841-86c4-406c12519fc2","Type":"ContainerStarted","Data":"ae83612ac487991e8873ddbd81c596d83cda8d817d48e2d4f3415b2af4ed9bf6"} Mar 12 13:32:36 crc kubenswrapper[4921]: I0312 13:32:36.716113 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:36 crc kubenswrapper[4921]: I0312 13:32:36.738076 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" podStartSLOduration=2.7380589 podStartE2EDuration="2.7380589s" podCreationTimestamp="2026-03-12 13:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:32:36.736423051 +0000 UTC m=+1379.426495032" watchObservedRunningTime="2026-03-12 13:32:36.7380589 +0000 UTC m=+1379.428130881" Mar 12 13:32:40 crc kubenswrapper[4921]: I0312 13:32:40.235162 4921 scope.go:117] "RemoveContainer" containerID="6ec6847728a310a5ebe83d645ad8fba01a7971d5fcc48461074fa52038ee05a5" Mar 12 13:32:44 crc kubenswrapper[4921]: I0312 13:32:44.656071 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 13:32:44 crc kubenswrapper[4921]: I0312 13:32:44.746985 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2v5ng"] Mar 12 13:32:44 crc kubenswrapper[4921]: I0312 13:32:44.747213 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerName="dnsmasq-dns" containerID="cri-o://1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0" gracePeriod=10 Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.178722 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.266238 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-dns-svc\") pod \"dde16069-3b0f-4323-afc5-7b33f07f6bce\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.266311 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dccsr\" (UniqueName: \"kubernetes.io/projected/dde16069-3b0f-4323-afc5-7b33f07f6bce-kube-api-access-dccsr\") pod \"dde16069-3b0f-4323-afc5-7b33f07f6bce\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.266345 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-config\") pod \"dde16069-3b0f-4323-afc5-7b33f07f6bce\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.266361 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-nb\") pod \"dde16069-3b0f-4323-afc5-7b33f07f6bce\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.266537 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-openstack-edpm-ipam\") pod \"dde16069-3b0f-4323-afc5-7b33f07f6bce\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.266580 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-sb\") pod \"dde16069-3b0f-4323-afc5-7b33f07f6bce\" (UID: \"dde16069-3b0f-4323-afc5-7b33f07f6bce\") " Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.285873 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde16069-3b0f-4323-afc5-7b33f07f6bce-kube-api-access-dccsr" (OuterVolumeSpecName: "kube-api-access-dccsr") pod "dde16069-3b0f-4323-afc5-7b33f07f6bce" (UID: "dde16069-3b0f-4323-afc5-7b33f07f6bce"). InnerVolumeSpecName "kube-api-access-dccsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.346417 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dde16069-3b0f-4323-afc5-7b33f07f6bce" (UID: "dde16069-3b0f-4323-afc5-7b33f07f6bce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.346430 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-config" (OuterVolumeSpecName: "config") pod "dde16069-3b0f-4323-afc5-7b33f07f6bce" (UID: "dde16069-3b0f-4323-afc5-7b33f07f6bce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.358869 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dde16069-3b0f-4323-afc5-7b33f07f6bce" (UID: "dde16069-3b0f-4323-afc5-7b33f07f6bce"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.369432 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.369467 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-config\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.369478 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.369489 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dccsr\" (UniqueName: \"kubernetes.io/projected/dde16069-3b0f-4323-afc5-7b33f07f6bce-kube-api-access-dccsr\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.371903 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dde16069-3b0f-4323-afc5-7b33f07f6bce" (UID: "dde16069-3b0f-4323-afc5-7b33f07f6bce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.374984 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "dde16069-3b0f-4323-afc5-7b33f07f6bce" (UID: "dde16069-3b0f-4323-afc5-7b33f07f6bce"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.471369 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.471397 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde16069-3b0f-4323-afc5-7b33f07f6bce-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.832519 4921 generic.go:334] "Generic (PLEG): container finished" podID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerID="1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0" exitCode=0 Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.832559 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" event={"ID":"dde16069-3b0f-4323-afc5-7b33f07f6bce","Type":"ContainerDied","Data":"1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0"} Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.832588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" event={"ID":"dde16069-3b0f-4323-afc5-7b33f07f6bce","Type":"ContainerDied","Data":"56962b662872d31d3847dac09683faf3fc6bbf8ac3163bde8ca1f9e7fe375077"} Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.832622 4921 scope.go:117] "RemoveContainer" containerID="1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.832659 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2v5ng" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.861709 4921 scope.go:117] "RemoveContainer" containerID="736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.941369 4921 scope.go:117] "RemoveContainer" containerID="1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0" Mar 12 13:32:45 crc kubenswrapper[4921]: E0312 13:32:45.941940 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0\": container with ID starting with 1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0 not found: ID does not exist" containerID="1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.941982 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0"} err="failed to get container status \"1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0\": rpc error: code = NotFound desc = could not find container \"1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0\": container with ID starting with 1b47ead2999081c0759b3baf6769db652439b9242cc9fac1f8d56266505f0cd0 not found: ID does not exist" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.942011 4921 scope.go:117] "RemoveContainer" containerID="736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696" Mar 12 13:32:45 crc kubenswrapper[4921]: E0312 13:32:45.942432 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696\": container with ID starting with 
736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696 not found: ID does not exist" containerID="736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.942451 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696"} err="failed to get container status \"736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696\": rpc error: code = NotFound desc = could not find container \"736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696\": container with ID starting with 736dc87b37618430d192d29e2a9493b7e3621056c537dedd668161153e27b696 not found: ID does not exist" Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.948644 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2v5ng"] Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.979326 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2v5ng"] Mar 12 13:32:45 crc kubenswrapper[4921]: I0312 13:32:45.997044 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" path="/var/lib/kubelet/pods/dde16069-3b0f-4323-afc5-7b33f07f6bce/volumes" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.879959 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7"] Mar 12 13:32:50 crc kubenswrapper[4921]: E0312 13:32:50.881251 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerName="dnsmasq-dns" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.881270 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerName="dnsmasq-dns" Mar 12 13:32:50 crc kubenswrapper[4921]: E0312 13:32:50.881303 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerName="init" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.881312 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerName="init" Mar 12 13:32:50 crc kubenswrapper[4921]: E0312 13:32:50.881328 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerName="dnsmasq-dns" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.881337 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerName="dnsmasq-dns" Mar 12 13:32:50 crc kubenswrapper[4921]: E0312 13:32:50.881356 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerName="init" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.881365 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerName="init" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.881666 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdae03dd-47dd-4e2a-901b-5ec7cc01e91c" containerName="dnsmasq-dns" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.881695 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde16069-3b0f-4323-afc5-7b33f07f6bce" containerName="dnsmasq-dns" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.882434 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.885490 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.885940 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.886191 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.889722 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.894496 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7"] Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.981627 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.981931 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:50 crc 
kubenswrapper[4921]: I0312 13:32:50.982279 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:50 crc kubenswrapper[4921]: I0312 13:32:50.982366 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmhjt\" (UniqueName: \"kubernetes.io/projected/cd563802-76ca-4f00-bb21-caef86a804ce-kube-api-access-qmhjt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.083938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.084165 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.084225 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmhjt\" (UniqueName: 
\"kubernetes.io/projected/cd563802-76ca-4f00-bb21-caef86a804ce-kube-api-access-qmhjt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.084341 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.089654 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.090097 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.094497 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.103152 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmhjt\" (UniqueName: \"kubernetes.io/projected/cd563802-76ca-4f00-bb21-caef86a804ce-kube-api-access-qmhjt\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.214642 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.774599 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7"] Mar 12 13:32:51 crc kubenswrapper[4921]: W0312 13:32:51.775708 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd563802_76ca_4f00_bb21_caef86a804ce.slice/crio-7015d3d51d889f73e248abc6ca8a3385524d9d63491a7e3c23757ed107283d73 WatchSource:0}: Error finding container 7015d3d51d889f73e248abc6ca8a3385524d9d63491a7e3c23757ed107283d73: Status 404 returned error can't find the container with id 7015d3d51d889f73e248abc6ca8a3385524d9d63491a7e3c23757ed107283d73 Mar 12 13:32:51 crc kubenswrapper[4921]: I0312 13:32:51.897046 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" event={"ID":"cd563802-76ca-4f00-bb21-caef86a804ce","Type":"ContainerStarted","Data":"7015d3d51d889f73e248abc6ca8a3385524d9d63491a7e3c23757ed107283d73"} Mar 12 13:32:54 crc kubenswrapper[4921]: I0312 13:32:54.933582 4921 generic.go:334] "Generic (PLEG): container finished" podID="7e627c0e-6753-4c4a-ad5f-7d36e4373a2c" 
containerID="0d4ab79e2275a7883f22d504995cd28fd02d21790cea8b78b7bcb4eea4a2ca23" exitCode=0 Mar 12 13:32:54 crc kubenswrapper[4921]: I0312 13:32:54.933704 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c","Type":"ContainerDied","Data":"0d4ab79e2275a7883f22d504995cd28fd02d21790cea8b78b7bcb4eea4a2ca23"} Mar 12 13:32:56 crc kubenswrapper[4921]: I0312 13:32:56.324306 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:32:56 crc kubenswrapper[4921]: I0312 13:32:56.324674 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:32:56 crc kubenswrapper[4921]: I0312 13:32:56.956987 4921 generic.go:334] "Generic (PLEG): container finished" podID="b28ef2e5-d1ca-460a-9c97-a058c098ef64" containerID="3223f3b0d75da5f6541bbb789154fd9a68366dc5226a487010384693e5aae502" exitCode=0 Mar 12 13:32:56 crc kubenswrapper[4921]: I0312 13:32:56.957044 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b28ef2e5-d1ca-460a-9c97-a058c098ef64","Type":"ContainerDied","Data":"3223f3b0d75da5f6541bbb789154fd9a68366dc5226a487010384693e5aae502"} Mar 12 13:33:00 crc kubenswrapper[4921]: I0312 13:33:00.001010 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b28ef2e5-d1ca-460a-9c97-a058c098ef64","Type":"ContainerStarted","Data":"b61524f9cc4f402140bb021c842461eea3dd0d2cedca41156e168e877ec2eb95"} Mar 12 13:33:00 crc kubenswrapper[4921]: I0312 13:33:00.001493 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:33:00 crc kubenswrapper[4921]: I0312 13:33:00.040350 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.040331743 podStartE2EDuration="39.040331743s" podCreationTimestamp="2026-03-12 13:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:33:00.033501187 +0000 UTC m=+1402.723573168" watchObservedRunningTime="2026-03-12 13:33:00.040331743 +0000 UTC m=+1402.730403734" Mar 12 13:33:01 crc kubenswrapper[4921]: I0312 13:33:01.011463 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" event={"ID":"cd563802-76ca-4f00-bb21-caef86a804ce","Type":"ContainerStarted","Data":"df989e8f8c9418baec821bcd9c40a977ea784f8b6e134fe2d3564147b7ed3e8e"} Mar 12 13:33:01 crc kubenswrapper[4921]: I0312 13:33:01.015712 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7e627c0e-6753-4c4a-ad5f-7d36e4373a2c","Type":"ContainerStarted","Data":"b11312f12e0b066545cad1720fec1624b1079faadd90904280318341af7473c1"} Mar 12 13:33:01 crc kubenswrapper[4921]: I0312 13:33:01.016104 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 12 13:33:01 crc kubenswrapper[4921]: I0312 13:33:01.044707 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" podStartSLOduration=3.09635823 podStartE2EDuration="11.04467802s" 
podCreationTimestamp="2026-03-12 13:32:50 +0000 UTC" firstStartedPulling="2026-03-12 13:32:51.778928818 +0000 UTC m=+1394.469000789" lastFinishedPulling="2026-03-12 13:32:59.727248588 +0000 UTC m=+1402.417320579" observedRunningTime="2026-03-12 13:33:01.030713707 +0000 UTC m=+1403.720785678" watchObservedRunningTime="2026-03-12 13:33:01.04467802 +0000 UTC m=+1403.734750001" Mar 12 13:33:01 crc kubenswrapper[4921]: I0312 13:33:01.066281 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.066261642 podStartE2EDuration="42.066261642s" podCreationTimestamp="2026-03-12 13:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:33:01.056573129 +0000 UTC m=+1403.746645150" watchObservedRunningTime="2026-03-12 13:33:01.066261642 +0000 UTC m=+1403.756333613" Mar 12 13:33:10 crc kubenswrapper[4921]: I0312 13:33:10.167005 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 13:33:12 crc kubenswrapper[4921]: I0312 13:33:12.129419 4921 generic.go:334] "Generic (PLEG): container finished" podID="cd563802-76ca-4f00-bb21-caef86a804ce" containerID="df989e8f8c9418baec821bcd9c40a977ea784f8b6e134fe2d3564147b7ed3e8e" exitCode=0 Mar 12 13:33:12 crc kubenswrapper[4921]: I0312 13:33:12.129493 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" event={"ID":"cd563802-76ca-4f00-bb21-caef86a804ce","Type":"ContainerDied","Data":"df989e8f8c9418baec821bcd9c40a977ea784f8b6e134fe2d3564147b7ed3e8e"} Mar 12 13:33:12 crc kubenswrapper[4921]: I0312 13:33:12.225031 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.541943 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.622799 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmhjt\" (UniqueName: \"kubernetes.io/projected/cd563802-76ca-4f00-bb21-caef86a804ce-kube-api-access-qmhjt\") pod \"cd563802-76ca-4f00-bb21-caef86a804ce\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.622911 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-ssh-key-openstack-edpm-ipam\") pod \"cd563802-76ca-4f00-bb21-caef86a804ce\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.622953 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-repo-setup-combined-ca-bundle\") pod \"cd563802-76ca-4f00-bb21-caef86a804ce\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.623060 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-inventory\") pod \"cd563802-76ca-4f00-bb21-caef86a804ce\" (UID: \"cd563802-76ca-4f00-bb21-caef86a804ce\") " Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.629036 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "cd563802-76ca-4f00-bb21-caef86a804ce" (UID: "cd563802-76ca-4f00-bb21-caef86a804ce"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.629930 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd563802-76ca-4f00-bb21-caef86a804ce-kube-api-access-qmhjt" (OuterVolumeSpecName: "kube-api-access-qmhjt") pod "cd563802-76ca-4f00-bb21-caef86a804ce" (UID: "cd563802-76ca-4f00-bb21-caef86a804ce"). InnerVolumeSpecName "kube-api-access-qmhjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.647736 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cd563802-76ca-4f00-bb21-caef86a804ce" (UID: "cd563802-76ca-4f00-bb21-caef86a804ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.657113 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-inventory" (OuterVolumeSpecName: "inventory") pod "cd563802-76ca-4f00-bb21-caef86a804ce" (UID: "cd563802-76ca-4f00-bb21-caef86a804ce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.725388 4921 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.725419 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.725428 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmhjt\" (UniqueName: \"kubernetes.io/projected/cd563802-76ca-4f00-bb21-caef86a804ce-kube-api-access-qmhjt\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:13 crc kubenswrapper[4921]: I0312 13:33:13.725438 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd563802-76ca-4f00-bb21-caef86a804ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.154810 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" event={"ID":"cd563802-76ca-4f00-bb21-caef86a804ce","Type":"ContainerDied","Data":"7015d3d51d889f73e248abc6ca8a3385524d9d63491a7e3c23757ed107283d73"} Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.155184 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7015d3d51d889f73e248abc6ca8a3385524d9d63491a7e3c23757ed107283d73" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.154938 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.240027 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"] Mar 12 13:33:14 crc kubenswrapper[4921]: E0312 13:33:14.240571 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd563802-76ca-4f00-bb21-caef86a804ce" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.240601 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd563802-76ca-4f00-bb21-caef86a804ce" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.240948 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd563802-76ca-4f00-bb21-caef86a804ce" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.241708 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.246324 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.246398 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.246636 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.246978 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.270271 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"] Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.334649 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq96n\" (UniqueName: \"kubernetes.io/projected/287a8351-7199-4b48-90c1-e1a58233fae2-kube-api-access-xq96n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.335026 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 
13:33:14.335162 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.335318 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.436530 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.436800 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.436950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.437235 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq96n\" (UniqueName: \"kubernetes.io/projected/287a8351-7199-4b48-90c1-e1a58233fae2-kube-api-access-xq96n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.441926 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.442472 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.443208 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.455223 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq96n\" (UniqueName: \"kubernetes.io/projected/287a8351-7199-4b48-90c1-e1a58233fae2-kube-api-access-xq96n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:14 crc kubenswrapper[4921]: I0312 13:33:14.570651 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" Mar 12 13:33:15 crc kubenswrapper[4921]: I0312 13:33:15.106227 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"] Mar 12 13:33:15 crc kubenswrapper[4921]: I0312 13:33:15.111836 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:33:15 crc kubenswrapper[4921]: I0312 13:33:15.163241 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" event={"ID":"287a8351-7199-4b48-90c1-e1a58233fae2","Type":"ContainerStarted","Data":"899fed6a2107f6f32c08c130ee5fa39a39e39c2bf66cff9a95f363ae1854cf87"} Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.199173 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" event={"ID":"287a8351-7199-4b48-90c1-e1a58233fae2","Type":"ContainerStarted","Data":"340af1523869e468840aa430e759e3a1bbeafd3ae7d343dcc5266e10acdb0a9c"} Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.224501 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4s5pf"] Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.228124 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.248278 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4s5pf"] Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.255845 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" podStartSLOduration=2.260468556 podStartE2EDuration="3.255824819s" podCreationTimestamp="2026-03-12 13:33:14 +0000 UTC" firstStartedPulling="2026-03-12 13:33:15.111631432 +0000 UTC m=+1417.801703403" lastFinishedPulling="2026-03-12 13:33:16.106987695 +0000 UTC m=+1418.797059666" observedRunningTime="2026-03-12 13:33:17.239441014 +0000 UTC m=+1419.929513015" watchObservedRunningTime="2026-03-12 13:33:17.255824819 +0000 UTC m=+1419.945896790" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.429445 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-utilities\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.429868 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nslgn\" (UniqueName: \"kubernetes.io/projected/711f54fc-5844-494f-ba4d-01e56cc9a990-kube-api-access-nslgn\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.429939 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-catalog-content\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.531106 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nslgn\" (UniqueName: \"kubernetes.io/projected/711f54fc-5844-494f-ba4d-01e56cc9a990-kube-api-access-nslgn\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.531155 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-catalog-content\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.531187 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-utilities\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.531751 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-catalog-content\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.531769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-utilities\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.549351 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nslgn\" (UniqueName: \"kubernetes.io/projected/711f54fc-5844-494f-ba4d-01e56cc9a990-kube-api-access-nslgn\") pod \"redhat-operators-4s5pf\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:17 crc kubenswrapper[4921]: I0312 13:33:17.567321 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:18 crc kubenswrapper[4921]: W0312 13:33:18.057187 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711f54fc_5844_494f_ba4d_01e56cc9a990.slice/crio-250f692103a2b674bf377afcc0d8d6d7ddf01722ff4eeba2bcf8d7af79cd4aeb WatchSource:0}: Error finding container 250f692103a2b674bf377afcc0d8d6d7ddf01722ff4eeba2bcf8d7af79cd4aeb: Status 404 returned error can't find the container with id 250f692103a2b674bf377afcc0d8d6d7ddf01722ff4eeba2bcf8d7af79cd4aeb Mar 12 13:33:18 crc kubenswrapper[4921]: I0312 13:33:18.057794 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4s5pf"] Mar 12 13:33:18 crc kubenswrapper[4921]: I0312 13:33:18.211059 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerStarted","Data":"250f692103a2b674bf377afcc0d8d6d7ddf01722ff4eeba2bcf8d7af79cd4aeb"} Mar 12 13:33:19 crc kubenswrapper[4921]: I0312 13:33:19.224765 4921 generic.go:334] "Generic (PLEG): container finished" podID="711f54fc-5844-494f-ba4d-01e56cc9a990" 
containerID="cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b" exitCode=0 Mar 12 13:33:19 crc kubenswrapper[4921]: I0312 13:33:19.224848 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerDied","Data":"cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b"} Mar 12 13:33:21 crc kubenswrapper[4921]: I0312 13:33:21.248679 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerStarted","Data":"1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5"} Mar 12 13:33:22 crc kubenswrapper[4921]: I0312 13:33:22.260839 4921 generic.go:334] "Generic (PLEG): container finished" podID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerID="1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5" exitCode=0 Mar 12 13:33:22 crc kubenswrapper[4921]: I0312 13:33:22.260911 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerDied","Data":"1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5"} Mar 12 13:33:23 crc kubenswrapper[4921]: I0312 13:33:23.270388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerStarted","Data":"5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190"} Mar 12 13:33:23 crc kubenswrapper[4921]: I0312 13:33:23.290525 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4s5pf" podStartSLOduration=2.827413528 podStartE2EDuration="6.290510152s" podCreationTimestamp="2026-03-12 13:33:17 +0000 UTC" firstStartedPulling="2026-03-12 13:33:19.226969885 +0000 UTC 
m=+1421.917041866" lastFinishedPulling="2026-03-12 13:33:22.690066479 +0000 UTC m=+1425.380138490" observedRunningTime="2026-03-12 13:33:23.286337797 +0000 UTC m=+1425.976409778" watchObservedRunningTime="2026-03-12 13:33:23.290510152 +0000 UTC m=+1425.980582123" Mar 12 13:33:26 crc kubenswrapper[4921]: I0312 13:33:26.324495 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:33:26 crc kubenswrapper[4921]: I0312 13:33:26.324976 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:33:26 crc kubenswrapper[4921]: I0312 13:33:26.325035 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:33:26 crc kubenswrapper[4921]: I0312 13:33:26.325899 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7697d191845361ae138c9b2df2cde8ebed453242ceaff45c19913d28c03c6fd3"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:33:26 crc kubenswrapper[4921]: I0312 13:33:26.325970 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" 
containerID="cri-o://7697d191845361ae138c9b2df2cde8ebed453242ceaff45c19913d28c03c6fd3" gracePeriod=600 Mar 12 13:33:27 crc kubenswrapper[4921]: I0312 13:33:27.322963 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="7697d191845361ae138c9b2df2cde8ebed453242ceaff45c19913d28c03c6fd3" exitCode=0 Mar 12 13:33:27 crc kubenswrapper[4921]: I0312 13:33:27.323021 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"7697d191845361ae138c9b2df2cde8ebed453242ceaff45c19913d28c03c6fd3"} Mar 12 13:33:27 crc kubenswrapper[4921]: I0312 13:33:27.323776 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"} Mar 12 13:33:27 crc kubenswrapper[4921]: I0312 13:33:27.323835 4921 scope.go:117] "RemoveContainer" containerID="f7722c7345ffa51f6b2d5016c3d605416a6961812caddc8f13639d2d6299573d" Mar 12 13:33:27 crc kubenswrapper[4921]: I0312 13:33:27.568274 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:27 crc kubenswrapper[4921]: I0312 13:33:27.568380 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:28 crc kubenswrapper[4921]: I0312 13:33:28.642024 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4s5pf" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="registry-server" probeResult="failure" output=< Mar 12 13:33:28 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 13:33:28 crc kubenswrapper[4921]: > 
Mar 12 13:33:37 crc kubenswrapper[4921]: I0312 13:33:37.634319 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:37 crc kubenswrapper[4921]: I0312 13:33:37.702956 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:37 crc kubenswrapper[4921]: I0312 13:33:37.879244 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4s5pf"] Mar 12 13:33:39 crc kubenswrapper[4921]: I0312 13:33:39.462470 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4s5pf" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="registry-server" containerID="cri-o://5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190" gracePeriod=2 Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:39.935539 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.103747 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-utilities\") pod \"711f54fc-5844-494f-ba4d-01e56cc9a990\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.104038 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-catalog-content\") pod \"711f54fc-5844-494f-ba4d-01e56cc9a990\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.104124 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nslgn\" (UniqueName: \"kubernetes.io/projected/711f54fc-5844-494f-ba4d-01e56cc9a990-kube-api-access-nslgn\") pod \"711f54fc-5844-494f-ba4d-01e56cc9a990\" (UID: \"711f54fc-5844-494f-ba4d-01e56cc9a990\") " Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.104910 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-utilities" (OuterVolumeSpecName: "utilities") pod "711f54fc-5844-494f-ba4d-01e56cc9a990" (UID: "711f54fc-5844-494f-ba4d-01e56cc9a990"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.114062 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711f54fc-5844-494f-ba4d-01e56cc9a990-kube-api-access-nslgn" (OuterVolumeSpecName: "kube-api-access-nslgn") pod "711f54fc-5844-494f-ba4d-01e56cc9a990" (UID: "711f54fc-5844-494f-ba4d-01e56cc9a990"). InnerVolumeSpecName "kube-api-access-nslgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.209905 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nslgn\" (UniqueName: \"kubernetes.io/projected/711f54fc-5844-494f-ba4d-01e56cc9a990-kube-api-access-nslgn\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.209938 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.302982 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "711f54fc-5844-494f-ba4d-01e56cc9a990" (UID: "711f54fc-5844-494f-ba4d-01e56cc9a990"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.311996 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711f54fc-5844-494f-ba4d-01e56cc9a990-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.486213 4921 generic.go:334] "Generic (PLEG): container finished" podID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerID="5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190" exitCode=0 Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.486277 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4s5pf" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.486297 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerDied","Data":"5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190"} Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.486332 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4s5pf" event={"ID":"711f54fc-5844-494f-ba4d-01e56cc9a990","Type":"ContainerDied","Data":"250f692103a2b674bf377afcc0d8d6d7ddf01722ff4eeba2bcf8d7af79cd4aeb"} Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.486350 4921 scope.go:117] "RemoveContainer" containerID="5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.509962 4921 scope.go:117] "RemoveContainer" containerID="1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.531856 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4s5pf"] Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.540528 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4s5pf"] Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.555335 4921 scope.go:117] "RemoveContainer" containerID="cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.599734 4921 scope.go:117] "RemoveContainer" containerID="5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190" Mar 12 13:33:40 crc kubenswrapper[4921]: E0312 13:33:40.600306 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190\": container with ID starting with 5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190 not found: ID does not exist" containerID="5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.600349 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190"} err="failed to get container status \"5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190\": rpc error: code = NotFound desc = could not find container \"5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190\": container with ID starting with 5d3fe86d13be9ae4b88d38b63d05b9a9d1eaa59b32af591a6a04295ff87d8190 not found: ID does not exist" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.600376 4921 scope.go:117] "RemoveContainer" containerID="1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5" Mar 12 13:33:40 crc kubenswrapper[4921]: E0312 13:33:40.600698 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5\": container with ID starting with 1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5 not found: ID does not exist" containerID="1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.600716 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5"} err="failed to get container status \"1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5\": rpc error: code = NotFound desc = could not find container \"1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5\": container with ID 
starting with 1b25526eaabfe400cbc1cbf86387173ddfc62bf54ce45783eddd255b1dfb29f5 not found: ID does not exist" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.600729 4921 scope.go:117] "RemoveContainer" containerID="cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b" Mar 12 13:33:40 crc kubenswrapper[4921]: E0312 13:33:40.601029 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b\": container with ID starting with cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b not found: ID does not exist" containerID="cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b" Mar 12 13:33:40 crc kubenswrapper[4921]: I0312 13:33:40.601066 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b"} err="failed to get container status \"cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b\": rpc error: code = NotFound desc = could not find container \"cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b\": container with ID starting with cba7e8da1fe9193f7c02e0639735375b151420a9162aca2f1d88272178e0af8b not found: ID does not exist" Mar 12 13:33:42 crc kubenswrapper[4921]: I0312 13:33:42.021009 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" path="/var/lib/kubelet/pods/711f54fc-5844-494f-ba4d-01e56cc9a990/volumes" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.173843 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555374-scjkw"] Mar 12 13:34:00 crc kubenswrapper[4921]: E0312 13:34:00.175365 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="extract-content" Mar 12 13:34:00 crc 
kubenswrapper[4921]: I0312 13:34:00.175394 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="extract-content" Mar 12 13:34:00 crc kubenswrapper[4921]: E0312 13:34:00.175419 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="extract-utilities" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.175434 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="extract-utilities" Mar 12 13:34:00 crc kubenswrapper[4921]: E0312 13:34:00.175460 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="registry-server" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.175476 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="registry-server" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.175941 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="711f54fc-5844-494f-ba4d-01e56cc9a990" containerName="registry-server" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.177085 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.179959 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.180062 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.180350 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.201745 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555374-scjkw"] Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.297253 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gdhz\" (UniqueName: \"kubernetes.io/projected/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60-kube-api-access-4gdhz\") pod \"auto-csr-approver-29555374-scjkw\" (UID: \"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60\") " pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.399111 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gdhz\" (UniqueName: \"kubernetes.io/projected/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60-kube-api-access-4gdhz\") pod \"auto-csr-approver-29555374-scjkw\" (UID: \"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60\") " pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.425359 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gdhz\" (UniqueName: \"kubernetes.io/projected/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60-kube-api-access-4gdhz\") pod \"auto-csr-approver-29555374-scjkw\" (UID: \"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60\") " 
pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:00 crc kubenswrapper[4921]: I0312 13:34:00.503353 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:01 crc kubenswrapper[4921]: W0312 13:34:01.076418 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode04e6bc7_1db4_4d33_89a0_ba4c75bcfe60.slice/crio-dee31b69938538af2921004ff82382aec447bdf9bd1854fe641c783fdbd6d234 WatchSource:0}: Error finding container dee31b69938538af2921004ff82382aec447bdf9bd1854fe641c783fdbd6d234: Status 404 returned error can't find the container with id dee31b69938538af2921004ff82382aec447bdf9bd1854fe641c783fdbd6d234 Mar 12 13:34:01 crc kubenswrapper[4921]: I0312 13:34:01.077549 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555374-scjkw"] Mar 12 13:34:01 crc kubenswrapper[4921]: I0312 13:34:01.730591 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555374-scjkw" event={"ID":"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60","Type":"ContainerStarted","Data":"dee31b69938538af2921004ff82382aec447bdf9bd1854fe641c783fdbd6d234"} Mar 12 13:34:02 crc kubenswrapper[4921]: I0312 13:34:02.741437 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555374-scjkw" event={"ID":"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60","Type":"ContainerStarted","Data":"1b2a8ea04d664bb89c44fecd7ca1ebde743a3540b6627c4e98a301293bdafb38"} Mar 12 13:34:02 crc kubenswrapper[4921]: I0312 13:34:02.760773 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555374-scjkw" podStartSLOduration=1.632097052 podStartE2EDuration="2.760744595s" podCreationTimestamp="2026-03-12 13:34:00 +0000 UTC" firstStartedPulling="2026-03-12 13:34:01.079387484 +0000 UTC 
m=+1463.769459455" lastFinishedPulling="2026-03-12 13:34:02.208035007 +0000 UTC m=+1464.898106998" observedRunningTime="2026-03-12 13:34:02.757278629 +0000 UTC m=+1465.447350600" watchObservedRunningTime="2026-03-12 13:34:02.760744595 +0000 UTC m=+1465.450816566" Mar 12 13:34:03 crc kubenswrapper[4921]: I0312 13:34:03.751264 4921 generic.go:334] "Generic (PLEG): container finished" podID="e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60" containerID="1b2a8ea04d664bb89c44fecd7ca1ebde743a3540b6627c4e98a301293bdafb38" exitCode=0 Mar 12 13:34:03 crc kubenswrapper[4921]: I0312 13:34:03.751313 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555374-scjkw" event={"ID":"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60","Type":"ContainerDied","Data":"1b2a8ea04d664bb89c44fecd7ca1ebde743a3540b6627c4e98a301293bdafb38"} Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.039696 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.081899 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gdhz\" (UniqueName: \"kubernetes.io/projected/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60-kube-api-access-4gdhz\") pod \"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60\" (UID: \"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60\") " Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.096239 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60-kube-api-access-4gdhz" (OuterVolumeSpecName: "kube-api-access-4gdhz") pod "e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60" (UID: "e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60"). InnerVolumeSpecName "kube-api-access-4gdhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.184877 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gdhz\" (UniqueName: \"kubernetes.io/projected/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60-kube-api-access-4gdhz\") on node \"crc\" DevicePath \"\"" Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.776782 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555374-scjkw" event={"ID":"e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60","Type":"ContainerDied","Data":"dee31b69938538af2921004ff82382aec447bdf9bd1854fe641c783fdbd6d234"} Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.776868 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee31b69938538af2921004ff82382aec447bdf9bd1854fe641c783fdbd6d234" Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.776886 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555374-scjkw" Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.866020 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555368-82pj9"] Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.874049 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555368-82pj9"] Mar 12 13:34:05 crc kubenswrapper[4921]: I0312 13:34:05.996774 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982b1b07-1ac6-4431-9226-3bf8129423cd" path="/var/lib/kubelet/pods/982b1b07-1ac6-4431-9226-3bf8129423cd/volumes" Mar 12 13:34:40 crc kubenswrapper[4921]: I0312 13:34:40.399641 4921 scope.go:117] "RemoveContainer" containerID="3d0fdc0ecffd9277de528115a9925fd78f5e48b685197844c2b4614363ef8dc0" Mar 12 13:34:40 crc kubenswrapper[4921]: I0312 13:34:40.439684 4921 scope.go:117] "RemoveContainer" 
containerID="3c5c0ffc7161c468760a7964361fbe00731b21aae9c39610fb20c3dc3e9e3a7e" Mar 12 13:34:40 crc kubenswrapper[4921]: I0312 13:34:40.481975 4921 scope.go:117] "RemoveContainer" containerID="4a9bbae02a3363ca87334d69e34ab8c25f1a8f8b6ffc28003e05588479008ef7" Mar 12 13:34:40 crc kubenswrapper[4921]: I0312 13:34:40.533004 4921 scope.go:117] "RemoveContainer" containerID="49f8886d7160b0c64fd0a031464306729473f2b2cfec92335eeaf84a2649b2af" Mar 12 13:34:40 crc kubenswrapper[4921]: I0312 13:34:40.553597 4921 scope.go:117] "RemoveContainer" containerID="d919f77d2517b14200444846ad80d74de444231258e626c9d3d8594c3eb01f3a" Mar 12 13:34:40 crc kubenswrapper[4921]: I0312 13:34:40.588479 4921 scope.go:117] "RemoveContainer" containerID="ab86bbafa64b7b54ca717739766a6060392900703d9735765d290b4d35b9b56c" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.670915 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txgpj"] Mar 12 13:34:53 crc kubenswrapper[4921]: E0312 13:34:53.672575 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60" containerName="oc" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.672607 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60" containerName="oc" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.673145 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60" containerName="oc" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.676167 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.694131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txgpj"] Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.772634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-catalog-content\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.773704 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-utilities\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.773744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxk9p\" (UniqueName: \"kubernetes.io/projected/f008ad24-548e-4db1-abcb-7c73fbf99907-kube-api-access-fxk9p\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.875703 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-catalog-content\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.875771 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-utilities\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.875789 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxk9p\" (UniqueName: \"kubernetes.io/projected/f008ad24-548e-4db1-abcb-7c73fbf99907-kube-api-access-fxk9p\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.876321 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-utilities\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.876377 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-catalog-content\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:53 crc kubenswrapper[4921]: I0312 13:34:53.898350 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxk9p\" (UniqueName: \"kubernetes.io/projected/f008ad24-548e-4db1-abcb-7c73fbf99907-kube-api-access-fxk9p\") pod \"redhat-marketplace-txgpj\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:54 crc kubenswrapper[4921]: I0312 13:34:54.005871 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:34:54 crc kubenswrapper[4921]: I0312 13:34:54.486506 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txgpj"] Mar 12 13:34:55 crc kubenswrapper[4921]: I0312 13:34:55.285459 4921 generic.go:334] "Generic (PLEG): container finished" podID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerID="787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8" exitCode=0 Mar 12 13:34:55 crc kubenswrapper[4921]: I0312 13:34:55.285559 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerDied","Data":"787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8"} Mar 12 13:34:55 crc kubenswrapper[4921]: I0312 13:34:55.285764 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerStarted","Data":"30550358e77cfc353359f49c6a486f8cc0bffa4991a7ed55895b6861c651a1eb"} Mar 12 13:34:56 crc kubenswrapper[4921]: I0312 13:34:56.297010 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerStarted","Data":"88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31"} Mar 12 13:34:57 crc kubenswrapper[4921]: I0312 13:34:57.314088 4921 generic.go:334] "Generic (PLEG): container finished" podID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerID="88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31" exitCode=0 Mar 12 13:34:57 crc kubenswrapper[4921]: I0312 13:34:57.314188 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" 
event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerDied","Data":"88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31"} Mar 12 13:34:58 crc kubenswrapper[4921]: I0312 13:34:58.336533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerStarted","Data":"38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be"} Mar 12 13:34:58 crc kubenswrapper[4921]: I0312 13:34:58.360491 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txgpj" podStartSLOduration=2.558108931 podStartE2EDuration="5.360478171s" podCreationTimestamp="2026-03-12 13:34:53 +0000 UTC" firstStartedPulling="2026-03-12 13:34:55.288081216 +0000 UTC m=+1517.978153217" lastFinishedPulling="2026-03-12 13:34:58.090450446 +0000 UTC m=+1520.780522457" observedRunningTime="2026-03-12 13:34:58.35652268 +0000 UTC m=+1521.046594651" watchObservedRunningTime="2026-03-12 13:34:58.360478171 +0000 UTC m=+1521.050550142" Mar 12 13:35:04 crc kubenswrapper[4921]: I0312 13:35:04.006260 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:35:04 crc kubenswrapper[4921]: I0312 13:35:04.007639 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:35:04 crc kubenswrapper[4921]: I0312 13:35:04.051537 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:35:04 crc kubenswrapper[4921]: I0312 13:35:04.478549 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:35:04 crc kubenswrapper[4921]: I0312 13:35:04.550356 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-txgpj"] Mar 12 13:35:06 crc kubenswrapper[4921]: I0312 13:35:06.415263 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txgpj" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="registry-server" containerID="cri-o://38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be" gracePeriod=2 Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.007703 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.123862 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxk9p\" (UniqueName: \"kubernetes.io/projected/f008ad24-548e-4db1-abcb-7c73fbf99907-kube-api-access-fxk9p\") pod \"f008ad24-548e-4db1-abcb-7c73fbf99907\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.124087 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-utilities\") pod \"f008ad24-548e-4db1-abcb-7c73fbf99907\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.124130 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-catalog-content\") pod \"f008ad24-548e-4db1-abcb-7c73fbf99907\" (UID: \"f008ad24-548e-4db1-abcb-7c73fbf99907\") " Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.125233 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-utilities" (OuterVolumeSpecName: "utilities") pod "f008ad24-548e-4db1-abcb-7c73fbf99907" (UID: 
"f008ad24-548e-4db1-abcb-7c73fbf99907"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.125733 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.129887 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f008ad24-548e-4db1-abcb-7c73fbf99907-kube-api-access-fxk9p" (OuterVolumeSpecName: "kube-api-access-fxk9p") pod "f008ad24-548e-4db1-abcb-7c73fbf99907" (UID: "f008ad24-548e-4db1-abcb-7c73fbf99907"). InnerVolumeSpecName "kube-api-access-fxk9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.157455 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f008ad24-548e-4db1-abcb-7c73fbf99907" (UID: "f008ad24-548e-4db1-abcb-7c73fbf99907"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.227173 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f008ad24-548e-4db1-abcb-7c73fbf99907-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.227503 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxk9p\" (UniqueName: \"kubernetes.io/projected/f008ad24-548e-4db1-abcb-7c73fbf99907-kube-api-access-fxk9p\") on node \"crc\" DevicePath \"\"" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.427101 4921 generic.go:334] "Generic (PLEG): container finished" podID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerID="38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be" exitCode=0 Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.427143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerDied","Data":"38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be"} Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.427168 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txgpj" event={"ID":"f008ad24-548e-4db1-abcb-7c73fbf99907","Type":"ContainerDied","Data":"30550358e77cfc353359f49c6a486f8cc0bffa4991a7ed55895b6861c651a1eb"} Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.427183 4921 scope.go:117] "RemoveContainer" containerID="38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.427201 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txgpj" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.458668 4921 scope.go:117] "RemoveContainer" containerID="88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.486408 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txgpj"] Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.493841 4921 scope.go:117] "RemoveContainer" containerID="787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.494205 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txgpj"] Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.533902 4921 scope.go:117] "RemoveContainer" containerID="38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be" Mar 12 13:35:07 crc kubenswrapper[4921]: E0312 13:35:07.534378 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be\": container with ID starting with 38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be not found: ID does not exist" containerID="38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.534426 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be"} err="failed to get container status \"38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be\": rpc error: code = NotFound desc = could not find container \"38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be\": container with ID starting with 38d536bd626c4330e64aad995ce9111b76ce58717618a8eae9b6fa0ced1e59be not found: 
ID does not exist" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.534459 4921 scope.go:117] "RemoveContainer" containerID="88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31" Mar 12 13:35:07 crc kubenswrapper[4921]: E0312 13:35:07.534768 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31\": container with ID starting with 88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31 not found: ID does not exist" containerID="88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.534800 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31"} err="failed to get container status \"88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31\": rpc error: code = NotFound desc = could not find container \"88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31\": container with ID starting with 88990f5723d67b72b4d18360bed65a92c4af29e365a0a6d99a5ecaed3108ff31 not found: ID does not exist" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.534842 4921 scope.go:117] "RemoveContainer" containerID="787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8" Mar 12 13:35:07 crc kubenswrapper[4921]: E0312 13:35:07.535072 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8\": container with ID starting with 787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8 not found: ID does not exist" containerID="787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.535103 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8"} err="failed to get container status \"787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8\": rpc error: code = NotFound desc = could not find container \"787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8\": container with ID starting with 787a8ca95e69fbdc10c78765b90e2b3073477fabfe4f87cce1a4d5e2dda399c8 not found: ID does not exist" Mar 12 13:35:07 crc kubenswrapper[4921]: I0312 13:35:07.999345 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" path="/var/lib/kubelet/pods/f008ad24-548e-4db1-abcb-7c73fbf99907/volumes" Mar 12 13:35:26 crc kubenswrapper[4921]: I0312 13:35:26.323756 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:35:26 crc kubenswrapper[4921]: I0312 13:35:26.325874 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:35:40 crc kubenswrapper[4921]: I0312 13:35:40.706421 4921 scope.go:117] "RemoveContainer" containerID="8ae6415b86e637b09ed6ed8eb721da79db2b5d0cf0ae025f5ebd65187ff5ba71" Mar 12 13:35:40 crc kubenswrapper[4921]: I0312 13:35:40.763146 4921 scope.go:117] "RemoveContainer" containerID="6208e6e16328b07ee57acad0b7d54e2206e69379e81312961f6b0b4954b50748" Mar 12 13:35:40 crc kubenswrapper[4921]: I0312 13:35:40.803595 4921 scope.go:117] "RemoveContainer" 
containerID="5de3ad8d3563a1c396e01a64104cc10969c7df05da22fb2e628de17147678961" Mar 12 13:35:56 crc kubenswrapper[4921]: I0312 13:35:56.323993 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:35:56 crc kubenswrapper[4921]: I0312 13:35:56.325913 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.155309 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555376-2fkvr"] Mar 12 13:36:00 crc kubenswrapper[4921]: E0312 13:36:00.157485 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="extract-utilities" Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.157591 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="extract-utilities" Mar 12 13:36:00 crc kubenswrapper[4921]: E0312 13:36:00.157661 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="extract-content" Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.157760 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="extract-content" Mar 12 13:36:00 crc kubenswrapper[4921]: E0312 13:36:00.157853 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="registry-server" 
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.157928 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="registry-server"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.158280 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f008ad24-548e-4db1-abcb-7c73fbf99907" containerName="registry-server"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.164040 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.171312 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.175258 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.175397 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.187292 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555376-2fkvr"]
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.256294 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkn4h\" (UniqueName: \"kubernetes.io/projected/7cc7ff0c-3e22-46c8-b128-e637f21f83d3-kube-api-access-hkn4h\") pod \"auto-csr-approver-29555376-2fkvr\" (UID: \"7cc7ff0c-3e22-46c8-b128-e637f21f83d3\") " pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.358031 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkn4h\" (UniqueName: \"kubernetes.io/projected/7cc7ff0c-3e22-46c8-b128-e637f21f83d3-kube-api-access-hkn4h\") pod \"auto-csr-approver-29555376-2fkvr\" (UID: \"7cc7ff0c-3e22-46c8-b128-e637f21f83d3\") " pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.381826 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkn4h\" (UniqueName: \"kubernetes.io/projected/7cc7ff0c-3e22-46c8-b128-e637f21f83d3-kube-api-access-hkn4h\") pod \"auto-csr-approver-29555376-2fkvr\" (UID: \"7cc7ff0c-3e22-46c8-b128-e637f21f83d3\") " pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:00 crc kubenswrapper[4921]: I0312 13:36:00.501848 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:01 crc kubenswrapper[4921]: I0312 13:36:01.035456 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555376-2fkvr"]
Mar 12 13:36:02 crc kubenswrapper[4921]: I0312 13:36:02.016703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555376-2fkvr" event={"ID":"7cc7ff0c-3e22-46c8-b128-e637f21f83d3","Type":"ContainerStarted","Data":"53517a40d4e17e8ecb580fdef40058845d8c07e3c77f51acf1227a7c2d7f717f"}
Mar 12 13:36:03 crc kubenswrapper[4921]: I0312 13:36:03.027712 4921 generic.go:334] "Generic (PLEG): container finished" podID="7cc7ff0c-3e22-46c8-b128-e637f21f83d3" containerID="6488db2f8389620c8d4df043174953e4e0dafe0036a135bac043ac1faba99efc" exitCode=0
Mar 12 13:36:03 crc kubenswrapper[4921]: I0312 13:36:03.027937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555376-2fkvr" event={"ID":"7cc7ff0c-3e22-46c8-b128-e637f21f83d3","Type":"ContainerDied","Data":"6488db2f8389620c8d4df043174953e4e0dafe0036a135bac043ac1faba99efc"}
Mar 12 13:36:04 crc kubenswrapper[4921]: I0312 13:36:04.393664 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:04 crc kubenswrapper[4921]: I0312 13:36:04.546345 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkn4h\" (UniqueName: \"kubernetes.io/projected/7cc7ff0c-3e22-46c8-b128-e637f21f83d3-kube-api-access-hkn4h\") pod \"7cc7ff0c-3e22-46c8-b128-e637f21f83d3\" (UID: \"7cc7ff0c-3e22-46c8-b128-e637f21f83d3\") "
Mar 12 13:36:04 crc kubenswrapper[4921]: I0312 13:36:04.553984 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cc7ff0c-3e22-46c8-b128-e637f21f83d3-kube-api-access-hkn4h" (OuterVolumeSpecName: "kube-api-access-hkn4h") pod "7cc7ff0c-3e22-46c8-b128-e637f21f83d3" (UID: "7cc7ff0c-3e22-46c8-b128-e637f21f83d3"). InnerVolumeSpecName "kube-api-access-hkn4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:36:04 crc kubenswrapper[4921]: I0312 13:36:04.649077 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkn4h\" (UniqueName: \"kubernetes.io/projected/7cc7ff0c-3e22-46c8-b128-e637f21f83d3-kube-api-access-hkn4h\") on node \"crc\" DevicePath \"\""
Mar 12 13:36:05 crc kubenswrapper[4921]: I0312 13:36:05.050157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555376-2fkvr" event={"ID":"7cc7ff0c-3e22-46c8-b128-e637f21f83d3","Type":"ContainerDied","Data":"53517a40d4e17e8ecb580fdef40058845d8c07e3c77f51acf1227a7c2d7f717f"}
Mar 12 13:36:05 crc kubenswrapper[4921]: I0312 13:36:05.050448 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53517a40d4e17e8ecb580fdef40058845d8c07e3c77f51acf1227a7c2d7f717f"
Mar 12 13:36:05 crc kubenswrapper[4921]: I0312 13:36:05.050567 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555376-2fkvr"
Mar 12 13:36:05 crc kubenswrapper[4921]: I0312 13:36:05.484824 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555370-6g99d"]
Mar 12 13:36:05 crc kubenswrapper[4921]: I0312 13:36:05.495348 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555370-6g99d"]
Mar 12 13:36:05 crc kubenswrapper[4921]: I0312 13:36:05.994777 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4c99a71-792e-4bc1-81d5-e75c67437787" path="/var/lib/kubelet/pods/f4c99a71-792e-4bc1-81d5-e75c67437787/volumes"
Mar 12 13:36:26 crc kubenswrapper[4921]: I0312 13:36:26.323904 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 13:36:26 crc kubenswrapper[4921]: I0312 13:36:26.324600 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 13:36:26 crc kubenswrapper[4921]: I0312 13:36:26.324658 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq"
Mar 12 13:36:26 crc kubenswrapper[4921]: I0312 13:36:26.326122 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 13:36:26 crc kubenswrapper[4921]: I0312 13:36:26.326222 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" gracePeriod=600
Mar 12 13:36:26 crc kubenswrapper[4921]: E0312 13:36:26.463073 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:36:27 crc kubenswrapper[4921]: I0312 13:36:27.288399 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" exitCode=0
Mar 12 13:36:27 crc kubenswrapper[4921]: I0312 13:36:27.288459 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"}
Mar 12 13:36:27 crc kubenswrapper[4921]: I0312 13:36:27.288504 4921 scope.go:117] "RemoveContainer" containerID="7697d191845361ae138c9b2df2cde8ebed453242ceaff45c19913d28c03c6fd3"
Mar 12 13:36:27 crc kubenswrapper[4921]: I0312 13:36:27.289375 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:36:27 crc kubenswrapper[4921]: E0312 13:36:27.289772 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:36:38 crc kubenswrapper[4921]: I0312 13:36:38.983760 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:36:38 crc kubenswrapper[4921]: E0312 13:36:38.984678 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:36:40 crc kubenswrapper[4921]: I0312 13:36:40.895910 4921 scope.go:117] "RemoveContainer" containerID="514a39081be976dbb2b8573ffca91f0b4371f6c7fa4a3fbf11e8f04783c97598"
Mar 12 13:36:44 crc kubenswrapper[4921]: I0312 13:36:44.505889 4921 generic.go:334] "Generic (PLEG): container finished" podID="287a8351-7199-4b48-90c1-e1a58233fae2" containerID="340af1523869e468840aa430e759e3a1bbeafd3ae7d343dcc5266e10acdb0a9c" exitCode=0
Mar 12 13:36:44 crc kubenswrapper[4921]: I0312 13:36:44.506660 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" event={"ID":"287a8351-7199-4b48-90c1-e1a58233fae2","Type":"ContainerDied","Data":"340af1523869e468840aa430e759e3a1bbeafd3ae7d343dcc5266e10acdb0a9c"}
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.894509 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.949238 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-ssh-key-openstack-edpm-ipam\") pod \"287a8351-7199-4b48-90c1-e1a58233fae2\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") "
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.949413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-bootstrap-combined-ca-bundle\") pod \"287a8351-7199-4b48-90c1-e1a58233fae2\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") "
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.949465 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq96n\" (UniqueName: \"kubernetes.io/projected/287a8351-7199-4b48-90c1-e1a58233fae2-kube-api-access-xq96n\") pod \"287a8351-7199-4b48-90c1-e1a58233fae2\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") "
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.949545 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-inventory\") pod \"287a8351-7199-4b48-90c1-e1a58233fae2\" (UID: \"287a8351-7199-4b48-90c1-e1a58233fae2\") "
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.954945 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "287a8351-7199-4b48-90c1-e1a58233fae2" (UID: "287a8351-7199-4b48-90c1-e1a58233fae2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.955040 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/287a8351-7199-4b48-90c1-e1a58233fae2-kube-api-access-xq96n" (OuterVolumeSpecName: "kube-api-access-xq96n") pod "287a8351-7199-4b48-90c1-e1a58233fae2" (UID: "287a8351-7199-4b48-90c1-e1a58233fae2"). InnerVolumeSpecName "kube-api-access-xq96n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.973707 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "287a8351-7199-4b48-90c1-e1a58233fae2" (UID: "287a8351-7199-4b48-90c1-e1a58233fae2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:36:45 crc kubenswrapper[4921]: I0312 13:36:45.974077 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-inventory" (OuterVolumeSpecName: "inventory") pod "287a8351-7199-4b48-90c1-e1a58233fae2" (UID: "287a8351-7199-4b48-90c1-e1a58233fae2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.052106 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.052133 4921 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.052145 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq96n\" (UniqueName: \"kubernetes.io/projected/287a8351-7199-4b48-90c1-e1a58233fae2-kube-api-access-xq96n\") on node \"crc\" DevicePath \"\""
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.052154 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/287a8351-7199-4b48-90c1-e1a58233fae2-inventory\") on node \"crc\" DevicePath \"\""
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.536641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r" event={"ID":"287a8351-7199-4b48-90c1-e1a58233fae2","Type":"ContainerDied","Data":"899fed6a2107f6f32c08c130ee5fa39a39e39c2bf66cff9a95f363ae1854cf87"}
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.536974 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899fed6a2107f6f32c08c130ee5fa39a39e39c2bf66cff9a95f363ae1854cf87"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.536856 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.606713 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"]
Mar 12 13:36:46 crc kubenswrapper[4921]: E0312 13:36:46.607298 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cc7ff0c-3e22-46c8-b128-e637f21f83d3" containerName="oc"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.607362 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cc7ff0c-3e22-46c8-b128-e637f21f83d3" containerName="oc"
Mar 12 13:36:46 crc kubenswrapper[4921]: E0312 13:36:46.607417 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="287a8351-7199-4b48-90c1-e1a58233fae2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.607463 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="287a8351-7199-4b48-90c1-e1a58233fae2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.607697 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="287a8351-7199-4b48-90c1-e1a58233fae2" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.607751 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cc7ff0c-3e22-46c8-b128-e637f21f83d3" containerName="oc"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.609628 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.611577 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.611831 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.612776 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.613174 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.620054 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"]
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.763259 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.763497 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkm6l\" (UniqueName: \"kubernetes.io/projected/ea960b85-3b3c-4afb-a363-a3a0e3327701-kube-api-access-xkm6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.764307 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.866020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.866115 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkm6l\" (UniqueName: \"kubernetes.io/projected/ea960b85-3b3c-4afb-a363-a3a0e3327701-kube-api-access-xkm6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.866215 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.869513 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.869759 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.880590 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkm6l\" (UniqueName: \"kubernetes.io/projected/ea960b85-3b3c-4afb-a363-a3a0e3327701-kube-api-access-xkm6l\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:46 crc kubenswrapper[4921]: I0312 13:36:46.925856 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"
Mar 12 13:36:47 crc kubenswrapper[4921]: I0312 13:36:47.425740 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"]
Mar 12 13:36:47 crc kubenswrapper[4921]: W0312 13:36:47.428179 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea960b85_3b3c_4afb_a363_a3a0e3327701.slice/crio-46ea1f24dfbf1093bd9f0fb23e56ec691a6cfcb54caff950a52b6d85012d2085 WatchSource:0}: Error finding container 46ea1f24dfbf1093bd9f0fb23e56ec691a6cfcb54caff950a52b6d85012d2085: Status 404 returned error can't find the container with id 46ea1f24dfbf1093bd9f0fb23e56ec691a6cfcb54caff950a52b6d85012d2085
Mar 12 13:36:47 crc kubenswrapper[4921]: I0312 13:36:47.549000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" event={"ID":"ea960b85-3b3c-4afb-a363-a3a0e3327701","Type":"ContainerStarted","Data":"46ea1f24dfbf1093bd9f0fb23e56ec691a6cfcb54caff950a52b6d85012d2085"}
Mar 12 13:36:48 crc kubenswrapper[4921]: I0312 13:36:48.557217 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" event={"ID":"ea960b85-3b3c-4afb-a363-a3a0e3327701","Type":"ContainerStarted","Data":"52a7c12169126ebf1da8074f295d3a7a00929d9b8f6c6f3d8cb83850b8805005"}
Mar 12 13:36:48 crc kubenswrapper[4921]: I0312 13:36:48.578536 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" podStartSLOduration=2.031402931 podStartE2EDuration="2.578517706s" podCreationTimestamp="2026-03-12 13:36:46 +0000 UTC" firstStartedPulling="2026-03-12 13:36:47.431242034 +0000 UTC m=+1630.121314005" lastFinishedPulling="2026-03-12 13:36:47.978356779 +0000 UTC m=+1630.668428780" observedRunningTime="2026-03-12 13:36:48.577889886 +0000 UTC m=+1631.267961857" watchObservedRunningTime="2026-03-12 13:36:48.578517706 +0000 UTC m=+1631.268589677"
Mar 12 13:36:50 crc kubenswrapper[4921]: I0312 13:36:50.983773 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:36:50 crc kubenswrapper[4921]: E0312 13:36:50.984398 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:37:03 crc kubenswrapper[4921]: I0312 13:37:03.984928 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:37:03 crc kubenswrapper[4921]: E0312 13:37:03.985964 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:37:18 crc kubenswrapper[4921]: I0312 13:37:18.984057 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:37:18 crc kubenswrapper[4921]: E0312 13:37:18.984948 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:37:33 crc kubenswrapper[4921]: I0312 13:37:33.983491 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:37:33 crc kubenswrapper[4921]: E0312 13:37:33.984234 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:37:44 crc kubenswrapper[4921]: I0312 13:37:44.984843 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:37:44 crc kubenswrapper[4921]: E0312 13:37:44.986214 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:37:57 crc kubenswrapper[4921]: I0312 13:37:57.999022 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8"
Mar 12 13:37:58 crc kubenswrapper[4921]: E0312 13:37:57.999923 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.052450 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-zd7sn"]
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.065205 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-527f-account-create-update-j7dk7"]
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.075610 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zd7sn"]
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.085378 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-527f-account-create-update-j7dk7"]
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.160256 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555378-c54xt"]
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.162915 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555378-c54xt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.165293 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.165293 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.165437 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.173266 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555378-c54xt"]
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.227304 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnptk\" (UniqueName: \"kubernetes.io/projected/c84ef1cf-f416-4617-a119-169a2104bc89-kube-api-access-bnptk\") pod \"auto-csr-approver-29555378-c54xt\" (UID: \"c84ef1cf-f416-4617-a119-169a2104bc89\") " pod="openshift-infra/auto-csr-approver-29555378-c54xt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.333351 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnptk\" (UniqueName: \"kubernetes.io/projected/c84ef1cf-f416-4617-a119-169a2104bc89-kube-api-access-bnptk\") pod \"auto-csr-approver-29555378-c54xt\" (UID: \"c84ef1cf-f416-4617-a119-169a2104bc89\") " pod="openshift-infra/auto-csr-approver-29555378-c54xt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.362834 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnptk\" (UniqueName: \"kubernetes.io/projected/c84ef1cf-f416-4617-a119-169a2104bc89-kube-api-access-bnptk\") pod \"auto-csr-approver-29555378-c54xt\" (UID: \"c84ef1cf-f416-4617-a119-169a2104bc89\") " pod="openshift-infra/auto-csr-approver-29555378-c54xt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.501132 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555378-c54xt"
Mar 12 13:38:00 crc kubenswrapper[4921]: I0312 13:38:00.957109 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555378-c54xt"]
Mar 12 13:38:00 crc kubenswrapper[4921]: W0312 13:38:00.961998 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc84ef1cf_f416_4617_a119_169a2104bc89.slice/crio-af896ee76911605efe4e16dda2b79e8e5f48677618411e06bb582b15ebc30c6d WatchSource:0}: Error finding container af896ee76911605efe4e16dda2b79e8e5f48677618411e06bb582b15ebc30c6d: Status 404 returned error can't find the container with id af896ee76911605efe4e16dda2b79e8e5f48677618411e06bb582b15ebc30c6d
Mar 12 13:38:01 crc kubenswrapper[4921]: I0312 13:38:01.238998 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555378-c54xt" event={"ID":"c84ef1cf-f416-4617-a119-169a2104bc89","Type":"ContainerStarted","Data":"af896ee76911605efe4e16dda2b79e8e5f48677618411e06bb582b15ebc30c6d"}
Mar 12 13:38:02 crc kubenswrapper[4921]: I0312 13:38:02.001070 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10be574f-fcea-4cd1-9ce9-7146709cc274" path="/var/lib/kubelet/pods/10be574f-fcea-4cd1-9ce9-7146709cc274/volumes"
Mar 12 13:38:02 crc kubenswrapper[4921]: I0312 13:38:02.002522 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ee872b-3556-4e4b-912a-4124b76e5ccc" path="/var/lib/kubelet/pods/a7ee872b-3556-4e4b-912a-4124b76e5ccc/volumes"
Mar 12 13:38:03 crc kubenswrapper[4921]: I0312 13:38:03.259190 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555378-c54xt" event={"ID":"c84ef1cf-f416-4617-a119-169a2104bc89","Type":"ContainerStarted","Data":"ffc2423c09527a0584d870700ec1c9dbbe5c170e2966414c03c79efebf0d440f"}
Mar 12 13:38:03 crc kubenswrapper[4921]: I0312 13:38:03.281600 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555378-c54xt" podStartSLOduration=1.5427208860000001 podStartE2EDuration="3.281579977s" podCreationTimestamp="2026-03-12 13:38:00 +0000 UTC" firstStartedPulling="2026-03-12 13:38:00.965607249 +0000 UTC m=+1703.655679220" lastFinishedPulling="2026-03-12 13:38:02.70446629 +0000 UTC m=+1705.394538311" observedRunningTime="2026-03-12 13:38:03.279981998 +0000 UTC m=+1705.970053979" watchObservedRunningTime="2026-03-12 13:38:03.281579977 +0000 UTC m=+1705.971651958"
Mar 12 13:38:04 crc kubenswrapper[4921]: I0312 13:38:04.272737 4921 generic.go:334] "Generic (PLEG): container finished" podID="c84ef1cf-f416-4617-a119-169a2104bc89" containerID="ffc2423c09527a0584d870700ec1c9dbbe5c170e2966414c03c79efebf0d440f" exitCode=0
Mar 12 13:38:04 crc kubenswrapper[4921]: I0312 13:38:04.272807 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555378-c54xt" event={"ID":"c84ef1cf-f416-4617-a119-169a2104bc89","Type":"ContainerDied","Data":"ffc2423c09527a0584d870700ec1c9dbbe5c170e2966414c03c79efebf0d440f"}
Mar 12 13:38:05 crc kubenswrapper[4921]: I0312 13:38:05.677539 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555378-c54xt"
Mar 12 13:38:05 crc kubenswrapper[4921]: I0312 13:38:05.856142 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnptk\" (UniqueName: \"kubernetes.io/projected/c84ef1cf-f416-4617-a119-169a2104bc89-kube-api-access-bnptk\") pod \"c84ef1cf-f416-4617-a119-169a2104bc89\" (UID: \"c84ef1cf-f416-4617-a119-169a2104bc89\") "
Mar 12 13:38:05 crc kubenswrapper[4921]: I0312 13:38:05.862043 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84ef1cf-f416-4617-a119-169a2104bc89-kube-api-access-bnptk" (OuterVolumeSpecName: "kube-api-access-bnptk") pod "c84ef1cf-f416-4617-a119-169a2104bc89" (UID: "c84ef1cf-f416-4617-a119-169a2104bc89"). InnerVolumeSpecName "kube-api-access-bnptk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:38:05 crc kubenswrapper[4921]: I0312 13:38:05.957411 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnptk\" (UniqueName: \"kubernetes.io/projected/c84ef1cf-f416-4617-a119-169a2104bc89-kube-api-access-bnptk\") on node \"crc\" DevicePath \"\""
Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.039297 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0711-account-create-update-8gxd6"]
Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.048561 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bb3b-account-create-update-bvs2n"]
Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.061499 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-c2x57"]
Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.070465 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0711-account-create-update-8gxd6"]
Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.080396 4921 kubelet.go:2431] "SyncLoop REMOVE"
source="api" pods=["openstack/keystone-db-create-c2x57"] Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.088444 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bb3b-account-create-update-bvs2n"] Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.095668 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-frghj"] Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.102509 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-frghj"] Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.291803 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555378-c54xt" event={"ID":"c84ef1cf-f416-4617-a119-169a2104bc89","Type":"ContainerDied","Data":"af896ee76911605efe4e16dda2b79e8e5f48677618411e06bb582b15ebc30c6d"} Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.291873 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af896ee76911605efe4e16dda2b79e8e5f48677618411e06bb582b15ebc30c6d" Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.291877 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555378-c54xt" Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.345385 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555372-2hkm4"] Mar 12 13:38:06 crc kubenswrapper[4921]: I0312 13:38:06.354046 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555372-2hkm4"] Mar 12 13:38:07 crc kubenswrapper[4921]: I0312 13:38:07.993625 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c025e13-55c5-4aab-8815-d7ab022219b7" path="/var/lib/kubelet/pods/4c025e13-55c5-4aab-8815-d7ab022219b7/volumes" Mar 12 13:38:07 crc kubenswrapper[4921]: I0312 13:38:07.994835 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64049769-9653-4351-b97a-881100721f77" path="/var/lib/kubelet/pods/64049769-9653-4351-b97a-881100721f77/volumes" Mar 12 13:38:07 crc kubenswrapper[4921]: I0312 13:38:07.995427 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f8352a3-a443-4b74-b6a3-57b2074f7cef" path="/var/lib/kubelet/pods/6f8352a3-a443-4b74-b6a3-57b2074f7cef/volumes" Mar 12 13:38:07 crc kubenswrapper[4921]: I0312 13:38:07.996014 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa3bfb0c-259e-44db-a8ca-4ea73bd4493d" path="/var/lib/kubelet/pods/aa3bfb0c-259e-44db-a8ca-4ea73bd4493d/volumes" Mar 12 13:38:07 crc kubenswrapper[4921]: I0312 13:38:07.997191 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc827adf-e6a8-4249-a452-8af8f3cde429" path="/var/lib/kubelet/pods/bc827adf-e6a8-4249-a452-8af8f3cde429/volumes" Mar 12 13:38:09 crc kubenswrapper[4921]: I0312 13:38:09.983771 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:38:09 crc kubenswrapper[4921]: E0312 13:38:09.984206 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:38:10 crc kubenswrapper[4921]: I0312 13:38:10.353678 4921 generic.go:334] "Generic (PLEG): container finished" podID="ea960b85-3b3c-4afb-a363-a3a0e3327701" containerID="52a7c12169126ebf1da8074f295d3a7a00929d9b8f6c6f3d8cb83850b8805005" exitCode=0 Mar 12 13:38:10 crc kubenswrapper[4921]: I0312 13:38:10.353882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" event={"ID":"ea960b85-3b3c-4afb-a363-a3a0e3327701","Type":"ContainerDied","Data":"52a7c12169126ebf1da8074f295d3a7a00929d9b8f6c6f3d8cb83850b8805005"} Mar 12 13:38:11 crc kubenswrapper[4921]: I0312 13:38:11.905729 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.023901 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-inventory\") pod \"ea960b85-3b3c-4afb-a363-a3a0e3327701\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.024069 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkm6l\" (UniqueName: \"kubernetes.io/projected/ea960b85-3b3c-4afb-a363-a3a0e3327701-kube-api-access-xkm6l\") pod \"ea960b85-3b3c-4afb-a363-a3a0e3327701\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.024138 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-ssh-key-openstack-edpm-ipam\") pod \"ea960b85-3b3c-4afb-a363-a3a0e3327701\" (UID: \"ea960b85-3b3c-4afb-a363-a3a0e3327701\") " Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.031575 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea960b85-3b3c-4afb-a363-a3a0e3327701-kube-api-access-xkm6l" (OuterVolumeSpecName: "kube-api-access-xkm6l") pod "ea960b85-3b3c-4afb-a363-a3a0e3327701" (UID: "ea960b85-3b3c-4afb-a363-a3a0e3327701"). InnerVolumeSpecName "kube-api-access-xkm6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.048365 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-inventory" (OuterVolumeSpecName: "inventory") pod "ea960b85-3b3c-4afb-a363-a3a0e3327701" (UID: "ea960b85-3b3c-4afb-a363-a3a0e3327701"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.053648 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ea960b85-3b3c-4afb-a363-a3a0e3327701" (UID: "ea960b85-3b3c-4afb-a363-a3a0e3327701"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.126759 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.126803 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea960b85-3b3c-4afb-a363-a3a0e3327701-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.126837 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkm6l\" (UniqueName: \"kubernetes.io/projected/ea960b85-3b3c-4afb-a363-a3a0e3327701-kube-api-access-xkm6l\") on node \"crc\" DevicePath \"\"" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.369027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" event={"ID":"ea960b85-3b3c-4afb-a363-a3a0e3327701","Type":"ContainerDied","Data":"46ea1f24dfbf1093bd9f0fb23e56ec691a6cfcb54caff950a52b6d85012d2085"} Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.369260 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46ea1f24dfbf1093bd9f0fb23e56ec691a6cfcb54caff950a52b6d85012d2085" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 
13:38:12.369095 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.543474 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq"] Mar 12 13:38:12 crc kubenswrapper[4921]: E0312 13:38:12.543990 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea960b85-3b3c-4afb-a363-a3a0e3327701" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.544013 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea960b85-3b3c-4afb-a363-a3a0e3327701" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:38:12 crc kubenswrapper[4921]: E0312 13:38:12.544044 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84ef1cf-f416-4617-a119-169a2104bc89" containerName="oc" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.544054 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84ef1cf-f416-4617-a119-169a2104bc89" containerName="oc" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.544259 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea960b85-3b3c-4afb-a363-a3a0e3327701" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.544285 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84ef1cf-f416-4617-a119-169a2104bc89" containerName="oc" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.544916 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.546538 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.547368 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.547497 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.547718 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.554120 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq"] Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.740134 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffdr6\" (UniqueName: \"kubernetes.io/projected/0dddd99f-1673-4e6e-983e-f5667c60e686-kube-api-access-ffdr6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.740303 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 
13:38:12.740371 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.841658 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.841747 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.841795 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffdr6\" (UniqueName: \"kubernetes.io/projected/0dddd99f-1673-4e6e-983e-f5667c60e686-kube-api-access-ffdr6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.854351 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.854397 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:12 crc kubenswrapper[4921]: I0312 13:38:12.870528 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffdr6\" (UniqueName: \"kubernetes.io/projected/0dddd99f-1673-4e6e-983e-f5667c60e686-kube-api-access-ffdr6\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:13 crc kubenswrapper[4921]: I0312 13:38:13.167161 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:13 crc kubenswrapper[4921]: I0312 13:38:13.508665 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq"] Mar 12 13:38:14 crc kubenswrapper[4921]: I0312 13:38:14.385296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" event={"ID":"0dddd99f-1673-4e6e-983e-f5667c60e686","Type":"ContainerStarted","Data":"1acf22ddeb5d88f6a619c688958b4a143b097e3e0314fa0c5f5765363c6f35b3"} Mar 12 13:38:14 crc kubenswrapper[4921]: I0312 13:38:14.385558 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" event={"ID":"0dddd99f-1673-4e6e-983e-f5667c60e686","Type":"ContainerStarted","Data":"c041cd4768278f390cd8f4709308c7874bbc8819560cbaf997ec28420245608b"} Mar 12 13:38:14 crc kubenswrapper[4921]: I0312 13:38:14.403720 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" podStartSLOduration=1.932294838 podStartE2EDuration="2.403700352s" podCreationTimestamp="2026-03-12 13:38:12 +0000 UTC" firstStartedPulling="2026-03-12 13:38:13.507596298 +0000 UTC m=+1716.197668269" lastFinishedPulling="2026-03-12 13:38:13.979001812 +0000 UTC m=+1716.669073783" observedRunningTime="2026-03-12 13:38:14.399648089 +0000 UTC m=+1717.089720130" watchObservedRunningTime="2026-03-12 13:38:14.403700352 +0000 UTC m=+1717.093772323" Mar 12 13:38:18 crc kubenswrapper[4921]: I0312 13:38:18.043493 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bd2gn"] Mar 12 13:38:18 crc kubenswrapper[4921]: I0312 13:38:18.051165 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bd2gn"] Mar 12 13:38:19 crc 
kubenswrapper[4921]: I0312 13:38:19.430493 4921 generic.go:334] "Generic (PLEG): container finished" podID="0dddd99f-1673-4e6e-983e-f5667c60e686" containerID="1acf22ddeb5d88f6a619c688958b4a143b097e3e0314fa0c5f5765363c6f35b3" exitCode=0 Mar 12 13:38:19 crc kubenswrapper[4921]: I0312 13:38:19.430616 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" event={"ID":"0dddd99f-1673-4e6e-983e-f5667c60e686","Type":"ContainerDied","Data":"1acf22ddeb5d88f6a619c688958b4a143b097e3e0314fa0c5f5765363c6f35b3"} Mar 12 13:38:19 crc kubenswrapper[4921]: I0312 13:38:19.995189 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c3d1a4-1ecf-4c47-86e2-068336153e40" path="/var/lib/kubelet/pods/97c3d1a4-1ecf-4c47-86e2-068336153e40/volumes" Mar 12 13:38:20 crc kubenswrapper[4921]: I0312 13:38:20.919425 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.007681 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-inventory\") pod \"0dddd99f-1673-4e6e-983e-f5667c60e686\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.007973 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-ssh-key-openstack-edpm-ipam\") pod \"0dddd99f-1673-4e6e-983e-f5667c60e686\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.008057 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffdr6\" (UniqueName: 
\"kubernetes.io/projected/0dddd99f-1673-4e6e-983e-f5667c60e686-kube-api-access-ffdr6\") pod \"0dddd99f-1673-4e6e-983e-f5667c60e686\" (UID: \"0dddd99f-1673-4e6e-983e-f5667c60e686\") " Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.017624 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dddd99f-1673-4e6e-983e-f5667c60e686-kube-api-access-ffdr6" (OuterVolumeSpecName: "kube-api-access-ffdr6") pod "0dddd99f-1673-4e6e-983e-f5667c60e686" (UID: "0dddd99f-1673-4e6e-983e-f5667c60e686"). InnerVolumeSpecName "kube-api-access-ffdr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.035729 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0dddd99f-1673-4e6e-983e-f5667c60e686" (UID: "0dddd99f-1673-4e6e-983e-f5667c60e686"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.051626 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-inventory" (OuterVolumeSpecName: "inventory") pod "0dddd99f-1673-4e6e-983e-f5667c60e686" (UID: "0dddd99f-1673-4e6e-983e-f5667c60e686"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.110932 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.110984 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dddd99f-1673-4e6e-983e-f5667c60e686-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.111005 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffdr6\" (UniqueName: \"kubernetes.io/projected/0dddd99f-1673-4e6e-983e-f5667c60e686-kube-api-access-ffdr6\") on node \"crc\" DevicePath \"\"" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.460326 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" event={"ID":"0dddd99f-1673-4e6e-983e-f5667c60e686","Type":"ContainerDied","Data":"c041cd4768278f390cd8f4709308c7874bbc8819560cbaf997ec28420245608b"} Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.460368 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c041cd4768278f390cd8f4709308c7874bbc8819560cbaf997ec28420245608b" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.460415 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.568497 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg"] Mar 12 13:38:21 crc kubenswrapper[4921]: E0312 13:38:21.569023 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dddd99f-1673-4e6e-983e-f5667c60e686" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.569049 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dddd99f-1673-4e6e-983e-f5667c60e686" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.569296 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dddd99f-1673-4e6e-983e-f5667c60e686" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.570039 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.572888 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.573514 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.573541 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.583400 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.585485 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg"] Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.719471 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz9ws\" (UniqueName: \"kubernetes.io/projected/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-kube-api-access-wz9ws\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.719523 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 
13:38:21.719580 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.820953 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz9ws\" (UniqueName: \"kubernetes.io/projected/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-kube-api-access-wz9ws\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.821000 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.821053 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.826707 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.826851 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.839313 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz9ws\" (UniqueName: \"kubernetes.io/projected/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-kube-api-access-wz9ws\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-v5mbg\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:21 crc kubenswrapper[4921]: I0312 13:38:21.895954 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:38:22 crc kubenswrapper[4921]: I0312 13:38:22.424258 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg"] Mar 12 13:38:22 crc kubenswrapper[4921]: W0312 13:38:22.430900 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ed6b8d_cd37_4f5a_b673_90c3527d99dd.slice/crio-8842bab4286821d5a8b424bb6eea45d06610a954241f3f4568951009c438ebcd WatchSource:0}: Error finding container 8842bab4286821d5a8b424bb6eea45d06610a954241f3f4568951009c438ebcd: Status 404 returned error can't find the container with id 8842bab4286821d5a8b424bb6eea45d06610a954241f3f4568951009c438ebcd Mar 12 13:38:22 crc kubenswrapper[4921]: I0312 13:38:22.433540 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:38:22 crc kubenswrapper[4921]: I0312 13:38:22.469362 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" event={"ID":"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd","Type":"ContainerStarted","Data":"8842bab4286821d5a8b424bb6eea45d06610a954241f3f4568951009c438ebcd"} Mar 12 13:38:23 crc kubenswrapper[4921]: I0312 13:38:23.041579 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hnqjm"] Mar 12 13:38:23 crc kubenswrapper[4921]: I0312 13:38:23.058642 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hnqjm"] Mar 12 13:38:23 crc kubenswrapper[4921]: I0312 13:38:23.479687 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" event={"ID":"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd","Type":"ContainerStarted","Data":"4bbf860c94b9ecdf0cb3c7221f98a8ca71810aa582ecad1f6a39308e27449edd"} Mar 12 13:38:23 crc 
kubenswrapper[4921]: I0312 13:38:23.515293 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" podStartSLOduration=1.713411008 podStartE2EDuration="2.515272161s" podCreationTimestamp="2026-03-12 13:38:21 +0000 UTC" firstStartedPulling="2026-03-12 13:38:22.433264742 +0000 UTC m=+1725.123336723" lastFinishedPulling="2026-03-12 13:38:23.235125895 +0000 UTC m=+1725.925197876" observedRunningTime="2026-03-12 13:38:23.508246324 +0000 UTC m=+1726.198318305" watchObservedRunningTime="2026-03-12 13:38:23.515272161 +0000 UTC m=+1726.205344132" Mar 12 13:38:24 crc kubenswrapper[4921]: I0312 13:38:23.999952 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c4cb6d3-4372-4daa-bebb-49c822b98228" path="/var/lib/kubelet/pods/7c4cb6d3-4372-4daa-bebb-49c822b98228/volumes" Mar 12 13:38:24 crc kubenswrapper[4921]: I0312 13:38:24.984517 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:38:24 crc kubenswrapper[4921]: E0312 13:38:24.985113 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.068017 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b1f7-account-create-update-7qvml"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.073115 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qs7p6"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.088527 4921 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-db-create-jwmtn"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.096458 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vdqdz"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.102865 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ceba-account-create-update-wpb2z"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.109508 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qs7p6"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.116644 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jwmtn"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.123546 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ceba-account-create-update-wpb2z"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.130321 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b1f7-account-create-update-7qvml"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.140113 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vdqdz"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.148105 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-5b2c-account-create-update-wtbtv"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.154451 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-5b2c-account-create-update-wtbtv"] Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.997544 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="234f532a-d318-49e4-91b9-731f2caa088d" path="/var/lib/kubelet/pods/234f532a-d318-49e4-91b9-731f2caa088d/volumes" Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.998442 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30f75b77-6080-41e8-a5db-9aa45c1c8fec" 
path="/var/lib/kubelet/pods/30f75b77-6080-41e8-a5db-9aa45c1c8fec/volumes" Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.998948 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4063981c-ffb9-4312-887c-8ca83e478a9d" path="/var/lib/kubelet/pods/4063981c-ffb9-4312-887c-8ca83e478a9d/volumes" Mar 12 13:38:33 crc kubenswrapper[4921]: I0312 13:38:33.999420 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e61f83e-1a98-4c70-adfd-537d68cf4d62" path="/var/lib/kubelet/pods/5e61f83e-1a98-4c70-adfd-537d68cf4d62/volumes" Mar 12 13:38:34 crc kubenswrapper[4921]: I0312 13:38:34.000387 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95af087a-ade7-45d2-b6a6-6ba5f6377393" path="/var/lib/kubelet/pods/95af087a-ade7-45d2-b6a6-6ba5f6377393/volumes" Mar 12 13:38:34 crc kubenswrapper[4921]: I0312 13:38:34.000896 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b224a4c1-1b4b-47d5-ac92-98560fbb0ca9" path="/var/lib/kubelet/pods/b224a4c1-1b4b-47d5-ac92-98560fbb0ca9/volumes" Mar 12 13:38:39 crc kubenswrapper[4921]: I0312 13:38:39.983988 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:38:39 crc kubenswrapper[4921]: E0312 13:38:39.985191 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:38:40 crc kubenswrapper[4921]: I0312 13:38:40.982563 4921 scope.go:117] "RemoveContainer" containerID="c5a788ad4a878e35801a780683af5419bc1e72b1a39b63de7f8324361e451cc8" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.011587 4921 
scope.go:117] "RemoveContainer" containerID="97a9d21a513fc667b427eec317df9c0a344aab77b82686f22a9fe4bcced47675" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.048717 4921 scope.go:117] "RemoveContainer" containerID="de7f23af07688eabaa9e26675fde1b7e4b563ecbdbec31b2feb3d49976f22ea0" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.091038 4921 scope.go:117] "RemoveContainer" containerID="f3c22bce66ad9c60d84382052c554ec84ad5fedfddff0fb93f94d516208b2105" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.140958 4921 scope.go:117] "RemoveContainer" containerID="7a63e758ad1ee2f086920e4d67ff603bedb55499f9447dbeb3145b227a411b3a" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.168411 4921 scope.go:117] "RemoveContainer" containerID="6918883df55f669524e9f77d8ffb0901e2288660f9b9e5a59566193fe46eb6f8" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.209588 4921 scope.go:117] "RemoveContainer" containerID="5b283a65f5056f6ae601c22ed7b82889c24f2d6afaa8210f42d6439b250c45ef" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.232644 4921 scope.go:117] "RemoveContainer" containerID="31c30a1d36d9057df2e5dfe033bc499851831c8a1a89ae99c250e9ea97fdb240" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.255249 4921 scope.go:117] "RemoveContainer" containerID="9cec882aa5a1f8e0abc7ae0ff9eb59c325d554327f2270dd8000777bdbdd8629" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.277040 4921 scope.go:117] "RemoveContainer" containerID="bc8fdd860ab4f86d44fc1e13fab924a2c27b7cbae950ef922b8e9f891f6f72b4" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.306349 4921 scope.go:117] "RemoveContainer" containerID="1f851b7e53f0bc224acd376a19e567dfb54923dfe171a0a2f2b16de487de0f93" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.326210 4921 scope.go:117] "RemoveContainer" containerID="9a257cbc99df1790ab488b536010ac8715d24cfba59c78d993f9bc8b8cb969b3" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.366091 4921 scope.go:117] 
"RemoveContainer" containerID="b3bfff4e7e90150268e08900a3c490fdf92482415e78b8f9ef56da1dd9945d4c" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.385781 4921 scope.go:117] "RemoveContainer" containerID="8e321f1a10378780b629912a343c698799d64170b4292d09a4f72f1bada3bf74" Mar 12 13:38:41 crc kubenswrapper[4921]: I0312 13:38:41.404524 4921 scope.go:117] "RemoveContainer" containerID="d03755685a95e15c9d405cf6b91bf49be24aff7ef896032795f7e72c3475ba95" Mar 12 13:38:42 crc kubenswrapper[4921]: I0312 13:38:42.036387 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9sscd"] Mar 12 13:38:42 crc kubenswrapper[4921]: I0312 13:38:42.051186 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9sscd"] Mar 12 13:38:43 crc kubenswrapper[4921]: I0312 13:38:43.998343 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b6fc72-721e-4dc3-9aa7-98707cfd403c" path="/var/lib/kubelet/pods/f5b6fc72-721e-4dc3-9aa7-98707cfd403c/volumes" Mar 12 13:38:52 crc kubenswrapper[4921]: I0312 13:38:52.984378 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:38:52 crc kubenswrapper[4921]: E0312 13:38:52.985385 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:39:05 crc kubenswrapper[4921]: I0312 13:39:05.051070 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fv8c4"] Mar 12 13:39:05 crc kubenswrapper[4921]: I0312 13:39:05.060052 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/neutron-db-sync-fv8c4"] Mar 12 13:39:05 crc kubenswrapper[4921]: I0312 13:39:05.878503 4921 generic.go:334] "Generic (PLEG): container finished" podID="a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" containerID="4bbf860c94b9ecdf0cb3c7221f98a8ca71810aa582ecad1f6a39308e27449edd" exitCode=0 Mar 12 13:39:05 crc kubenswrapper[4921]: I0312 13:39:05.878586 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" event={"ID":"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd","Type":"ContainerDied","Data":"4bbf860c94b9ecdf0cb3c7221f98a8ca71810aa582ecad1f6a39308e27449edd"} Mar 12 13:39:05 crc kubenswrapper[4921]: I0312 13:39:05.984231 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:39:05 crc kubenswrapper[4921]: E0312 13:39:05.984682 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:39:06 crc kubenswrapper[4921]: I0312 13:39:06.001907 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5331d35e-1086-4e7f-aa2f-164117b3df44" path="/var/lib/kubelet/pods/5331d35e-1086-4e7f-aa2f-164117b3df44/volumes" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.038217 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-7jtcj"] Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.045887 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-7jtcj"] Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.345115 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.522919 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz9ws\" (UniqueName: \"kubernetes.io/projected/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-kube-api-access-wz9ws\") pod \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.523081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-ssh-key-openstack-edpm-ipam\") pod \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.523212 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-inventory\") pod \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\" (UID: \"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd\") " Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.533015 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-kube-api-access-wz9ws" (OuterVolumeSpecName: "kube-api-access-wz9ws") pod "a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" (UID: "a5ed6b8d-cd37-4f5a-b673-90c3527d99dd"). InnerVolumeSpecName "kube-api-access-wz9ws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.556076 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" (UID: "a5ed6b8d-cd37-4f5a-b673-90c3527d99dd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.564035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-inventory" (OuterVolumeSpecName: "inventory") pod "a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" (UID: "a5ed6b8d-cd37-4f5a-b673-90c3527d99dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.625060 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.625106 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.625126 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz9ws\" (UniqueName: \"kubernetes.io/projected/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd-kube-api-access-wz9ws\") on node \"crc\" DevicePath \"\"" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.897546 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" 
event={"ID":"a5ed6b8d-cd37-4f5a-b673-90c3527d99dd","Type":"ContainerDied","Data":"8842bab4286821d5a8b424bb6eea45d06610a954241f3f4568951009c438ebcd"} Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.897585 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8842bab4286821d5a8b424bb6eea45d06610a954241f3f4568951009c438ebcd" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.898010 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.997554 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd04ca5a-99dd-40dc-9fb8-0722ca1e4015" path="/var/lib/kubelet/pods/dd04ca5a-99dd-40dc-9fb8-0722ca1e4015/volumes" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.998346 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx"] Mar 12 13:39:07 crc kubenswrapper[4921]: E0312 13:39:07.998711 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.998731 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.999003 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:39:07 crc kubenswrapper[4921]: I0312 13:39:07.999721 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.001918 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.002407 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.002456 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.002550 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.004589 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx"] Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.035546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.035594 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.035948 
4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tflzz\" (UniqueName: \"kubernetes.io/projected/2db44bac-3fcf-42b8-9703-4dc130b8b31f-kube-api-access-tflzz\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.137627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tflzz\" (UniqueName: \"kubernetes.io/projected/2db44bac-3fcf-42b8-9703-4dc130b8b31f-kube-api-access-tflzz\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.137855 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.137901 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.142444 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-inventory\") pod 
\"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.143249 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.157963 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tflzz\" (UniqueName: \"kubernetes.io/projected/2db44bac-3fcf-42b8-9703-4dc130b8b31f-kube-api-access-tflzz\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.327093 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.863874 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx"] Mar 12 13:39:08 crc kubenswrapper[4921]: I0312 13:39:08.909933 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" event={"ID":"2db44bac-3fcf-42b8-9703-4dc130b8b31f","Type":"ContainerStarted","Data":"67de4ddba7a16fe3bc47b96617813e2c3fbe3225ecdc9fe32cda8e49f6893369"} Mar 12 13:39:09 crc kubenswrapper[4921]: I0312 13:39:09.918565 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" event={"ID":"2db44bac-3fcf-42b8-9703-4dc130b8b31f","Type":"ContainerStarted","Data":"87b1556fc0b76ff6dc9a3b973a1949039dbe94fada8cca491cee4ba53e803be2"} Mar 12 13:39:09 crc kubenswrapper[4921]: I0312 13:39:09.935650 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" podStartSLOduration=2.457005877 podStartE2EDuration="2.935628302s" podCreationTimestamp="2026-03-12 13:39:07 +0000 UTC" firstStartedPulling="2026-03-12 13:39:08.870575195 +0000 UTC m=+1771.560647186" lastFinishedPulling="2026-03-12 13:39:09.34919763 +0000 UTC m=+1772.039269611" observedRunningTime="2026-03-12 13:39:09.932353172 +0000 UTC m=+1772.622425143" watchObservedRunningTime="2026-03-12 13:39:09.935628302 +0000 UTC m=+1772.625700283" Mar 12 13:39:12 crc kubenswrapper[4921]: I0312 13:39:12.039860 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kf74w"] Mar 12 13:39:12 crc kubenswrapper[4921]: I0312 13:39:12.050496 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kf74w"] Mar 12 13:39:13 crc kubenswrapper[4921]: I0312 
13:39:13.957101 4921 generic.go:334] "Generic (PLEG): container finished" podID="2db44bac-3fcf-42b8-9703-4dc130b8b31f" containerID="87b1556fc0b76ff6dc9a3b973a1949039dbe94fada8cca491cee4ba53e803be2" exitCode=0 Mar 12 13:39:13 crc kubenswrapper[4921]: I0312 13:39:13.957168 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" event={"ID":"2db44bac-3fcf-42b8-9703-4dc130b8b31f","Type":"ContainerDied","Data":"87b1556fc0b76ff6dc9a3b973a1949039dbe94fada8cca491cee4ba53e803be2"} Mar 12 13:39:13 crc kubenswrapper[4921]: I0312 13:39:13.994351 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e49b37-c533-4d52-9ed8-dcb54e4c0955" path="/var/lib/kubelet/pods/e3e49b37-c533-4d52-9ed8-dcb54e4c0955/volumes" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.366594 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.481847 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-ssh-key-openstack-edpm-ipam\") pod \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.482166 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-inventory\") pod \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.482249 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tflzz\" (UniqueName: 
\"kubernetes.io/projected/2db44bac-3fcf-42b8-9703-4dc130b8b31f-kube-api-access-tflzz\") pod \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\" (UID: \"2db44bac-3fcf-42b8-9703-4dc130b8b31f\") " Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.487890 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db44bac-3fcf-42b8-9703-4dc130b8b31f-kube-api-access-tflzz" (OuterVolumeSpecName: "kube-api-access-tflzz") pod "2db44bac-3fcf-42b8-9703-4dc130b8b31f" (UID: "2db44bac-3fcf-42b8-9703-4dc130b8b31f"). InnerVolumeSpecName "kube-api-access-tflzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.518379 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-inventory" (OuterVolumeSpecName: "inventory") pod "2db44bac-3fcf-42b8-9703-4dc130b8b31f" (UID: "2db44bac-3fcf-42b8-9703-4dc130b8b31f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.518955 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2db44bac-3fcf-42b8-9703-4dc130b8b31f" (UID: "2db44bac-3fcf-42b8-9703-4dc130b8b31f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.584766 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tflzz\" (UniqueName: \"kubernetes.io/projected/2db44bac-3fcf-42b8-9703-4dc130b8b31f-kube-api-access-tflzz\") on node \"crc\" DevicePath \"\"" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.584824 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.584839 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2db44bac-3fcf-42b8-9703-4dc130b8b31f-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.978314 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" event={"ID":"2db44bac-3fcf-42b8-9703-4dc130b8b31f","Type":"ContainerDied","Data":"67de4ddba7a16fe3bc47b96617813e2c3fbe3225ecdc9fe32cda8e49f6893369"} Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.978381 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67de4ddba7a16fe3bc47b96617813e2c3fbe3225ecdc9fe32cda8e49f6893369" Mar 12 13:39:15 crc kubenswrapper[4921]: I0312 13:39:15.978349 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.047708 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb"] Mar 12 13:39:16 crc kubenswrapper[4921]: E0312 13:39:16.048068 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db44bac-3fcf-42b8-9703-4dc130b8b31f" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.048085 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db44bac-3fcf-42b8-9703-4dc130b8b31f" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.048257 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db44bac-3fcf-42b8-9703-4dc130b8b31f" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.048976 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.051228 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.051298 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.051777 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.052464 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.073864 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb"] Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.200389 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.200465 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4kzj\" (UniqueName: \"kubernetes.io/projected/13f57794-2ec2-4a54-b5ec-e955a3e65288-kube-api-access-s4kzj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.200563 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.302143 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4kzj\" (UniqueName: \"kubernetes.io/projected/13f57794-2ec2-4a54-b5ec-e955a3e65288-kube-api-access-s4kzj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.302301 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.302380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.307037 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.312612 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.323623 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4kzj\" (UniqueName: \"kubernetes.io/projected/13f57794-2ec2-4a54-b5ec-e955a3e65288-kube-api-access-s4kzj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-djzqb\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.367274 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:39:16 crc kubenswrapper[4921]: I0312 13:39:16.873705 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb"] Mar 12 13:39:17 crc kubenswrapper[4921]: I0312 13:39:17.035896 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:39:17 crc kubenswrapper[4921]: E0312 13:39:17.036200 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:39:17 crc kubenswrapper[4921]: I0312 13:39:17.048290 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" event={"ID":"13f57794-2ec2-4a54-b5ec-e955a3e65288","Type":"ContainerStarted","Data":"3554283191f05082689f10140e29bdfdca529e28a0a859762465a5a553ef6c9d"} Mar 12 13:39:19 crc kubenswrapper[4921]: I0312 13:39:19.069603 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" event={"ID":"13f57794-2ec2-4a54-b5ec-e955a3e65288","Type":"ContainerStarted","Data":"3d99b376685fc632acec5ed4815f10ab19f97d0d9dbef0c549226197f3cf3612"} Mar 12 13:39:19 crc kubenswrapper[4921]: I0312 13:39:19.090996 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" podStartSLOduration=2.007645085 podStartE2EDuration="3.090976653s" podCreationTimestamp="2026-03-12 13:39:16 +0000 UTC" firstStartedPulling="2026-03-12 
13:39:16.880913566 +0000 UTC m=+1779.570985537" lastFinishedPulling="2026-03-12 13:39:17.964245134 +0000 UTC m=+1780.654317105" observedRunningTime="2026-03-12 13:39:19.090357594 +0000 UTC m=+1781.780429585" watchObservedRunningTime="2026-03-12 13:39:19.090976653 +0000 UTC m=+1781.781048624" Mar 12 13:39:23 crc kubenswrapper[4921]: I0312 13:39:23.063189 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-b8t7z"] Mar 12 13:39:23 crc kubenswrapper[4921]: I0312 13:39:23.078230 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-b8t7z"] Mar 12 13:39:23 crc kubenswrapper[4921]: I0312 13:39:23.998757 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9d3161d-0fd9-4116-8e46-74d541735563" path="/var/lib/kubelet/pods/a9d3161d-0fd9-4116-8e46-74d541735563/volumes" Mar 12 13:39:26 crc kubenswrapper[4921]: I0312 13:39:26.034402 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-4d9jl"] Mar 12 13:39:26 crc kubenswrapper[4921]: I0312 13:39:26.043062 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-4d9jl"] Mar 12 13:39:27 crc kubenswrapper[4921]: I0312 13:39:27.991050 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:39:27 crc kubenswrapper[4921]: E0312 13:39:27.992022 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:39:28 crc kubenswrapper[4921]: I0312 13:39:28.007101 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cb9315da-7a44-4703-bc68-935d517a4412" path="/var/lib/kubelet/pods/cb9315da-7a44-4703-bc68-935d517a4412/volumes" Mar 12 13:39:38 crc kubenswrapper[4921]: I0312 13:39:38.983545 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:39:38 crc kubenswrapper[4921]: E0312 13:39:38.984514 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:39:41 crc kubenswrapper[4921]: I0312 13:39:41.677964 4921 scope.go:117] "RemoveContainer" containerID="6dafa166e8b02279a8e99a63389b9d5e57fcd514cc809c05db9630ce997e963d" Mar 12 13:39:41 crc kubenswrapper[4921]: I0312 13:39:41.727409 4921 scope.go:117] "RemoveContainer" containerID="1df323f90efffdd1e6938c4300a799d121d63fc7aead625f796dded8e776229e" Mar 12 13:39:41 crc kubenswrapper[4921]: I0312 13:39:41.776416 4921 scope.go:117] "RemoveContainer" containerID="87b5541dd0e5017eb29e642444838157d37ef27ff4c6589461c2ca3515b083ca" Mar 12 13:39:41 crc kubenswrapper[4921]: I0312 13:39:41.800138 4921 scope.go:117] "RemoveContainer" containerID="5fc25203596b634784c962b5461abde49155a0352a648d5c2f127a66a9bc7a0f" Mar 12 13:39:41 crc kubenswrapper[4921]: I0312 13:39:41.852032 4921 scope.go:117] "RemoveContainer" containerID="549fabab75489983c2ff504acf4fb7d6e20b4ed5a226b2274e4a242f4d2f24e5" Mar 12 13:39:41 crc kubenswrapper[4921]: I0312 13:39:41.880112 4921 scope.go:117] "RemoveContainer" containerID="05a4aa398d018d3520833cceba9c5a91b21d92a30f54ad282d3f72f506cce6f5" Mar 12 13:39:51 crc kubenswrapper[4921]: I0312 13:39:51.983261 4921 scope.go:117] "RemoveContainer" 
containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:39:51 crc kubenswrapper[4921]: E0312 13:39:51.984312 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.156415 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555380-s2f9t"] Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.181183 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555380-s2f9t"] Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.181306 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.188005 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.191893 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.192111 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.199252 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbtsl\" (UniqueName: \"kubernetes.io/projected/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1-kube-api-access-cbtsl\") pod \"auto-csr-approver-29555380-s2f9t\" (UID: \"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1\") " pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.304885 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbtsl\" (UniqueName: \"kubernetes.io/projected/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1-kube-api-access-cbtsl\") pod \"auto-csr-approver-29555380-s2f9t\" (UID: \"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1\") " pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.340214 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbtsl\" (UniqueName: \"kubernetes.io/projected/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1-kube-api-access-cbtsl\") pod \"auto-csr-approver-29555380-s2f9t\" (UID: \"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1\") " pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.507318 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:00 crc kubenswrapper[4921]: I0312 13:40:00.982131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555380-s2f9t"] Mar 12 13:40:01 crc kubenswrapper[4921]: I0312 13:40:01.050597 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c4c4-account-create-update-p77m4"] Mar 12 13:40:01 crc kubenswrapper[4921]: I0312 13:40:01.060664 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c4c4-account-create-update-p77m4"] Mar 12 13:40:01 crc kubenswrapper[4921]: I0312 13:40:01.484485 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" event={"ID":"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1","Type":"ContainerStarted","Data":"2a535f0f1992d232cdbba1563722782978ecdbe15c6e2b2dafb17f14638060cb"} Mar 12 13:40:01 crc kubenswrapper[4921]: I0312 13:40:01.995725 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751eb572-c5a1-4154-bbf0-e076d215faed" path="/var/lib/kubelet/pods/751eb572-c5a1-4154-bbf0-e076d215faed/volumes" Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.036180 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-m98rn"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.051493 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ce1b-account-create-update-dpqvt"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.063763 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-mxzdk"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.074242 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-m98rn"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.082647 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-ce1b-account-create-update-dpqvt"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.088914 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-mxzdk"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.095426 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-247a-account-create-update-4l9sn"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.101884 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-247a-account-create-update-4l9sn"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.108047 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-glss8"] Mar 12 13:40:02 crc kubenswrapper[4921]: I0312 13:40:02.114216 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-glss8"] Mar 12 13:40:03 crc kubenswrapper[4921]: I0312 13:40:03.994413 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="314bb914-0157-480d-b873-57bcc6c6eaad" path="/var/lib/kubelet/pods/314bb914-0157-480d-b873-57bcc6c6eaad/volumes" Mar 12 13:40:03 crc kubenswrapper[4921]: I0312 13:40:03.996514 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39bbbc63-15d5-418d-bd13-95a97ae85e63" path="/var/lib/kubelet/pods/39bbbc63-15d5-418d-bd13-95a97ae85e63/volumes" Mar 12 13:40:03 crc kubenswrapper[4921]: I0312 13:40:03.997865 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956a8195-f544-4073-9042-544d311ef500" path="/var/lib/kubelet/pods/956a8195-f544-4073-9042-544d311ef500/volumes" Mar 12 13:40:03 crc kubenswrapper[4921]: I0312 13:40:03.998763 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd52e387-66f4-4b53-bfa7-23199af03b5e" path="/var/lib/kubelet/pods/cd52e387-66f4-4b53-bfa7-23199af03b5e/volumes" Mar 12 13:40:04 crc kubenswrapper[4921]: I0312 13:40:04.002041 4921 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="eba94cc5-7d33-4ec9-a923-f711f9794a5a" path="/var/lib/kubelet/pods/eba94cc5-7d33-4ec9-a923-f711f9794a5a/volumes" Mar 12 13:40:04 crc kubenswrapper[4921]: I0312 13:40:04.513781 4921 generic.go:334] "Generic (PLEG): container finished" podID="ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1" containerID="bb49d6322eed3ab617b5198826c0d92e19218aace45460b8e6fc78b761b0f700" exitCode=0 Mar 12 13:40:04 crc kubenswrapper[4921]: I0312 13:40:04.513857 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" event={"ID":"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1","Type":"ContainerDied","Data":"bb49d6322eed3ab617b5198826c0d92e19218aace45460b8e6fc78b761b0f700"} Mar 12 13:40:05 crc kubenswrapper[4921]: I0312 13:40:05.922733 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.120979 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbtsl\" (UniqueName: \"kubernetes.io/projected/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1-kube-api-access-cbtsl\") pod \"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1\" (UID: \"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1\") " Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.131586 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1-kube-api-access-cbtsl" (OuterVolumeSpecName: "kube-api-access-cbtsl") pod "ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1" (UID: "ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1"). InnerVolumeSpecName "kube-api-access-cbtsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.224285 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbtsl\" (UniqueName: \"kubernetes.io/projected/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1-kube-api-access-cbtsl\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.539431 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" event={"ID":"ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1","Type":"ContainerDied","Data":"2a535f0f1992d232cdbba1563722782978ecdbe15c6e2b2dafb17f14638060cb"} Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.539477 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a535f0f1992d232cdbba1563722782978ecdbe15c6e2b2dafb17f14638060cb" Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.539543 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555380-s2f9t" Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.983887 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:40:06 crc kubenswrapper[4921]: E0312 13:40:06.984429 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:40:06 crc kubenswrapper[4921]: I0312 13:40:06.990453 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555374-scjkw"] Mar 12 13:40:07 crc kubenswrapper[4921]: I0312 13:40:07.003049 4921 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555374-scjkw"] Mar 12 13:40:07 crc kubenswrapper[4921]: I0312 13:40:07.992027 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60" path="/var/lib/kubelet/pods/e04e6bc7-1db4-4d33-89a0-ba4c75bcfe60/volumes" Mar 12 13:40:10 crc kubenswrapper[4921]: E0312 13:40:10.218463 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f57794_2ec2_4a54_b5ec_e955a3e65288.slice/crio-3d99b376685fc632acec5ed4815f10ab19f97d0d9dbef0c549226197f3cf3612.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13f57794_2ec2_4a54_b5ec_e955a3e65288.slice/crio-conmon-3d99b376685fc632acec5ed4815f10ab19f97d0d9dbef0c549226197f3cf3612.scope\": RecentStats: unable to find data in memory cache]" Mar 12 13:40:10 crc kubenswrapper[4921]: I0312 13:40:10.583530 4921 generic.go:334] "Generic (PLEG): container finished" podID="13f57794-2ec2-4a54-b5ec-e955a3e65288" containerID="3d99b376685fc632acec5ed4815f10ab19f97d0d9dbef0c549226197f3cf3612" exitCode=0 Mar 12 13:40:10 crc kubenswrapper[4921]: I0312 13:40:10.583570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" event={"ID":"13f57794-2ec2-4a54-b5ec-e955a3e65288","Type":"ContainerDied","Data":"3d99b376685fc632acec5ed4815f10ab19f97d0d9dbef0c549226197f3cf3612"} Mar 12 13:40:11 crc kubenswrapper[4921]: I0312 13:40:11.990288 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.136064 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-inventory\") pod \"13f57794-2ec2-4a54-b5ec-e955a3e65288\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.136190 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-ssh-key-openstack-edpm-ipam\") pod \"13f57794-2ec2-4a54-b5ec-e955a3e65288\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.136849 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4kzj\" (UniqueName: \"kubernetes.io/projected/13f57794-2ec2-4a54-b5ec-e955a3e65288-kube-api-access-s4kzj\") pod \"13f57794-2ec2-4a54-b5ec-e955a3e65288\" (UID: \"13f57794-2ec2-4a54-b5ec-e955a3e65288\") " Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.142047 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f57794-2ec2-4a54-b5ec-e955a3e65288-kube-api-access-s4kzj" (OuterVolumeSpecName: "kube-api-access-s4kzj") pod "13f57794-2ec2-4a54-b5ec-e955a3e65288" (UID: "13f57794-2ec2-4a54-b5ec-e955a3e65288"). InnerVolumeSpecName "kube-api-access-s4kzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.168085 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-inventory" (OuterVolumeSpecName: "inventory") pod "13f57794-2ec2-4a54-b5ec-e955a3e65288" (UID: "13f57794-2ec2-4a54-b5ec-e955a3e65288"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.169758 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "13f57794-2ec2-4a54-b5ec-e955a3e65288" (UID: "13f57794-2ec2-4a54-b5ec-e955a3e65288"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.239577 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.239616 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/13f57794-2ec2-4a54-b5ec-e955a3e65288-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.239631 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4kzj\" (UniqueName: \"kubernetes.io/projected/13f57794-2ec2-4a54-b5ec-e955a3e65288-kube-api-access-s4kzj\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.606792 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" event={"ID":"13f57794-2ec2-4a54-b5ec-e955a3e65288","Type":"ContainerDied","Data":"3554283191f05082689f10140e29bdfdca529e28a0a859762465a5a553ef6c9d"} Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.606887 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3554283191f05082689f10140e29bdfdca529e28a0a859762465a5a553ef6c9d" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 
13:40:12.606954 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.682463 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tklmg"] Mar 12 13:40:12 crc kubenswrapper[4921]: E0312 13:40:12.682942 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1" containerName="oc" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.682960 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1" containerName="oc" Mar 12 13:40:12 crc kubenswrapper[4921]: E0312 13:40:12.682978 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f57794-2ec2-4a54-b5ec-e955a3e65288" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.682990 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f57794-2ec2-4a54-b5ec-e955a3e65288" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.683205 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f57794-2ec2-4a54-b5ec-e955a3e65288" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.683227 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1" containerName="oc" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.683952 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.686299 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.687031 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.687364 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.688401 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.703295 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tklmg"] Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.749999 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.750105 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brmnq\" (UniqueName: \"kubernetes.io/projected/43644579-35dc-4418-86e1-4f40c7bdcb8c-kube-api-access-brmnq\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.750168 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.852167 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.852444 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brmnq\" (UniqueName: \"kubernetes.io/projected/43644579-35dc-4418-86e1-4f40c7bdcb8c-kube-api-access-brmnq\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.852524 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.856281 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: 
I0312 13:40:12.857118 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:12 crc kubenswrapper[4921]: I0312 13:40:12.874114 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brmnq\" (UniqueName: \"kubernetes.io/projected/43644579-35dc-4418-86e1-4f40c7bdcb8c-kube-api-access-brmnq\") pod \"ssh-known-hosts-edpm-deployment-tklmg\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:13 crc kubenswrapper[4921]: I0312 13:40:13.001608 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:13 crc kubenswrapper[4921]: I0312 13:40:13.367917 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tklmg"] Mar 12 13:40:13 crc kubenswrapper[4921]: I0312 13:40:13.618774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" event={"ID":"43644579-35dc-4418-86e1-4f40c7bdcb8c","Type":"ContainerStarted","Data":"d9051a044a5c2ce219a3dd48ccd8a41ce4aba8962fa740a157c6bfe11e7cd97a"} Mar 12 13:40:14 crc kubenswrapper[4921]: I0312 13:40:14.631279 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" event={"ID":"43644579-35dc-4418-86e1-4f40c7bdcb8c","Type":"ContainerStarted","Data":"3a0f22f57c77b380da1205f336f508b487366b0af040624e7ca73b5b43cb3ac5"} Mar 12 13:40:14 crc kubenswrapper[4921]: I0312 13:40:14.657348 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" podStartSLOduration=2.130757386 
podStartE2EDuration="2.657319242s" podCreationTimestamp="2026-03-12 13:40:12 +0000 UTC" firstStartedPulling="2026-03-12 13:40:13.37766841 +0000 UTC m=+1836.067740391" lastFinishedPulling="2026-03-12 13:40:13.904230266 +0000 UTC m=+1836.594302247" observedRunningTime="2026-03-12 13:40:14.653475413 +0000 UTC m=+1837.343547424" watchObservedRunningTime="2026-03-12 13:40:14.657319242 +0000 UTC m=+1837.347391253" Mar 12 13:40:20 crc kubenswrapper[4921]: I0312 13:40:20.983269 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:40:20 crc kubenswrapper[4921]: E0312 13:40:20.984037 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:40:23 crc kubenswrapper[4921]: I0312 13:40:23.498118 4921 generic.go:334] "Generic (PLEG): container finished" podID="43644579-35dc-4418-86e1-4f40c7bdcb8c" containerID="3a0f22f57c77b380da1205f336f508b487366b0af040624e7ca73b5b43cb3ac5" exitCode=0 Mar 12 13:40:23 crc kubenswrapper[4921]: I0312 13:40:23.498209 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" event={"ID":"43644579-35dc-4418-86e1-4f40c7bdcb8c","Type":"ContainerDied","Data":"3a0f22f57c77b380da1205f336f508b487366b0af040624e7ca73b5b43cb3ac5"} Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.007764 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.109707 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brmnq\" (UniqueName: \"kubernetes.io/projected/43644579-35dc-4418-86e1-4f40c7bdcb8c-kube-api-access-brmnq\") pod \"43644579-35dc-4418-86e1-4f40c7bdcb8c\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.110282 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-inventory-0\") pod \"43644579-35dc-4418-86e1-4f40c7bdcb8c\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.110434 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-ssh-key-openstack-edpm-ipam\") pod \"43644579-35dc-4418-86e1-4f40c7bdcb8c\" (UID: \"43644579-35dc-4418-86e1-4f40c7bdcb8c\") " Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.122973 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43644579-35dc-4418-86e1-4f40c7bdcb8c-kube-api-access-brmnq" (OuterVolumeSpecName: "kube-api-access-brmnq") pod "43644579-35dc-4418-86e1-4f40c7bdcb8c" (UID: "43644579-35dc-4418-86e1-4f40c7bdcb8c"). InnerVolumeSpecName "kube-api-access-brmnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.144681 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "43644579-35dc-4418-86e1-4f40c7bdcb8c" (UID: "43644579-35dc-4418-86e1-4f40c7bdcb8c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.154534 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "43644579-35dc-4418-86e1-4f40c7bdcb8c" (UID: "43644579-35dc-4418-86e1-4f40c7bdcb8c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.212932 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.213104 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brmnq\" (UniqueName: \"kubernetes.io/projected/43644579-35dc-4418-86e1-4f40c7bdcb8c-kube-api-access-brmnq\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.213160 4921 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/43644579-35dc-4418-86e1-4f40c7bdcb8c-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.519249 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" 
event={"ID":"43644579-35dc-4418-86e1-4f40c7bdcb8c","Type":"ContainerDied","Data":"d9051a044a5c2ce219a3dd48ccd8a41ce4aba8962fa740a157c6bfe11e7cd97a"} Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.519301 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-tklmg" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.519302 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9051a044a5c2ce219a3dd48ccd8a41ce4aba8962fa740a157c6bfe11e7cd97a" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.604266 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj"] Mar 12 13:40:25 crc kubenswrapper[4921]: E0312 13:40:25.604662 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43644579-35dc-4418-86e1-4f40c7bdcb8c" containerName="ssh-known-hosts-edpm-deployment" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.604680 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="43644579-35dc-4418-86e1-4f40c7bdcb8c" containerName="ssh-known-hosts-edpm-deployment" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.604844 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="43644579-35dc-4418-86e1-4f40c7bdcb8c" containerName="ssh-known-hosts-edpm-deployment" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.605654 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.608701 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.609078 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.609651 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.610155 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.619014 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj"] Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.720212 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.720403 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6xw\" (UniqueName: \"kubernetes.io/projected/b0bf6515-dde5-4125-b030-f56edc8f6e31-kube-api-access-cg6xw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.720441 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.821900 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6xw\" (UniqueName: \"kubernetes.io/projected/b0bf6515-dde5-4125-b030-f56edc8f6e31-kube-api-access-cg6xw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.821956 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.822036 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.825601 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.826649 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.839774 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6xw\" (UniqueName: \"kubernetes.io/projected/b0bf6515-dde5-4125-b030-f56edc8f6e31-kube-api-access-cg6xw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-22qhj\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:25 crc kubenswrapper[4921]: I0312 13:40:25.929943 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:26 crc kubenswrapper[4921]: I0312 13:40:26.463106 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj"] Mar 12 13:40:26 crc kubenswrapper[4921]: W0312 13:40:26.464419 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0bf6515_dde5_4125_b030_f56edc8f6e31.slice/crio-d51bc86bb38881ba28693f066654d6d3aa6e4054275b5bb83d77c4cf12988959 WatchSource:0}: Error finding container d51bc86bb38881ba28693f066654d6d3aa6e4054275b5bb83d77c4cf12988959: Status 404 returned error can't find the container with id d51bc86bb38881ba28693f066654d6d3aa6e4054275b5bb83d77c4cf12988959 Mar 12 13:40:26 crc kubenswrapper[4921]: I0312 13:40:26.529742 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" event={"ID":"b0bf6515-dde5-4125-b030-f56edc8f6e31","Type":"ContainerStarted","Data":"d51bc86bb38881ba28693f066654d6d3aa6e4054275b5bb83d77c4cf12988959"} Mar 12 13:40:27 crc kubenswrapper[4921]: I0312 13:40:27.064951 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9g6d"] Mar 12 13:40:27 crc kubenswrapper[4921]: I0312 13:40:27.078286 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-t9g6d"] Mar 12 13:40:27 crc kubenswrapper[4921]: I0312 13:40:27.539269 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" event={"ID":"b0bf6515-dde5-4125-b030-f56edc8f6e31","Type":"ContainerStarted","Data":"f9ccbfee9fa36be612914ab85b2455ec477225b301d234b847e9a3a585366ed8"} Mar 12 13:40:27 crc kubenswrapper[4921]: I0312 13:40:27.554726 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" podStartSLOduration=1.983630963 podStartE2EDuration="2.554706475s" podCreationTimestamp="2026-03-12 13:40:25 +0000 UTC" firstStartedPulling="2026-03-12 13:40:26.466398694 +0000 UTC m=+1849.156470675" lastFinishedPulling="2026-03-12 13:40:27.037474206 +0000 UTC m=+1849.727546187" observedRunningTime="2026-03-12 13:40:27.552687564 +0000 UTC m=+1850.242759535" watchObservedRunningTime="2026-03-12 13:40:27.554706475 +0000 UTC m=+1850.244778446" Mar 12 13:40:28 crc kubenswrapper[4921]: I0312 13:40:28.008643 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="409470f7-5137-49c5-8d79-358a4466e1db" path="/var/lib/kubelet/pods/409470f7-5137-49c5-8d79-358a4466e1db/volumes" Mar 12 13:40:35 crc kubenswrapper[4921]: I0312 13:40:35.983095 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:40:35 crc kubenswrapper[4921]: E0312 13:40:35.983700 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:40:38 crc kubenswrapper[4921]: I0312 13:40:38.766374 4921 generic.go:334] "Generic (PLEG): container finished" podID="b0bf6515-dde5-4125-b030-f56edc8f6e31" containerID="f9ccbfee9fa36be612914ab85b2455ec477225b301d234b847e9a3a585366ed8" exitCode=0 Mar 12 13:40:38 crc kubenswrapper[4921]: I0312 13:40:38.766466 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" 
event={"ID":"b0bf6515-dde5-4125-b030-f56edc8f6e31","Type":"ContainerDied","Data":"f9ccbfee9fa36be612914ab85b2455ec477225b301d234b847e9a3a585366ed8"} Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.142827 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.306767 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-inventory\") pod \"b0bf6515-dde5-4125-b030-f56edc8f6e31\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.306888 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-ssh-key-openstack-edpm-ipam\") pod \"b0bf6515-dde5-4125-b030-f56edc8f6e31\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.306953 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg6xw\" (UniqueName: \"kubernetes.io/projected/b0bf6515-dde5-4125-b030-f56edc8f6e31-kube-api-access-cg6xw\") pod \"b0bf6515-dde5-4125-b030-f56edc8f6e31\" (UID: \"b0bf6515-dde5-4125-b030-f56edc8f6e31\") " Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.312715 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bf6515-dde5-4125-b030-f56edc8f6e31-kube-api-access-cg6xw" (OuterVolumeSpecName: "kube-api-access-cg6xw") pod "b0bf6515-dde5-4125-b030-f56edc8f6e31" (UID: "b0bf6515-dde5-4125-b030-f56edc8f6e31"). InnerVolumeSpecName "kube-api-access-cg6xw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.332754 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-inventory" (OuterVolumeSpecName: "inventory") pod "b0bf6515-dde5-4125-b030-f56edc8f6e31" (UID: "b0bf6515-dde5-4125-b030-f56edc8f6e31"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.348394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0bf6515-dde5-4125-b030-f56edc8f6e31" (UID: "b0bf6515-dde5-4125-b030-f56edc8f6e31"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.408468 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.408746 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bf6515-dde5-4125-b030-f56edc8f6e31-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.408759 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg6xw\" (UniqueName: \"kubernetes.io/projected/b0bf6515-dde5-4125-b030-f56edc8f6e31-kube-api-access-cg6xw\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.791575 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" 
event={"ID":"b0bf6515-dde5-4125-b030-f56edc8f6e31","Type":"ContainerDied","Data":"d51bc86bb38881ba28693f066654d6d3aa6e4054275b5bb83d77c4cf12988959"} Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.791614 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d51bc86bb38881ba28693f066654d6d3aa6e4054275b5bb83d77c4cf12988959" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.791643 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.871016 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b"] Mar 12 13:40:40 crc kubenswrapper[4921]: E0312 13:40:40.871405 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bf6515-dde5-4125-b030-f56edc8f6e31" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.871423 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bf6515-dde5-4125-b030-f56edc8f6e31" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.871587 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0bf6515-dde5-4125-b030-f56edc8f6e31" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.872165 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.877155 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.877324 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.877437 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.877569 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:40:40 crc kubenswrapper[4921]: I0312 13:40:40.887336 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b"] Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.020489 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.020920 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2wc2\" (UniqueName: \"kubernetes.io/projected/e2031892-1082-4d65-8768-ac76c82bfff0-kube-api-access-l2wc2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.021398 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.122808 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2wc2\" (UniqueName: \"kubernetes.io/projected/e2031892-1082-4d65-8768-ac76c82bfff0-kube-api-access-l2wc2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.122932 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.123110 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.128377 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.128554 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.151225 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2wc2\" (UniqueName: \"kubernetes.io/projected/e2031892-1082-4d65-8768-ac76c82bfff0-kube-api-access-l2wc2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.242664 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:41 crc kubenswrapper[4921]: I0312 13:40:41.796718 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b"] Mar 12 13:40:41 crc kubenswrapper[4921]: W0312 13:40:41.800967 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2031892_1082_4d65_8768_ac76c82bfff0.slice/crio-84cdcf425414cee2bffca1a8987f86ea1037a340a7c05dac382ae8b5b86e6e50 WatchSource:0}: Error finding container 84cdcf425414cee2bffca1a8987f86ea1037a340a7c05dac382ae8b5b86e6e50: Status 404 returned error can't find the container with id 84cdcf425414cee2bffca1a8987f86ea1037a340a7c05dac382ae8b5b86e6e50 Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.068326 4921 scope.go:117] "RemoveContainer" containerID="cb20a646af29c4b79c48b9d2b43126e629bcb9df396a8e9a5b583c72aa642e88" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.098589 4921 scope.go:117] "RemoveContainer" containerID="d6f5cbdfc8deaeb448f96c4bc2f6407691a67838a0ec753aad5250273989bbc6" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.129442 4921 scope.go:117] "RemoveContainer" containerID="95e6cfa841f2a0604bdaa1a3bc441224014b19a3878b899fd92d48dd8160d5af" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.197662 4921 scope.go:117] "RemoveContainer" containerID="74516089e9d6d419de312e366d1fb36ab956835375f0c9c9f071d367d710f30a" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.275661 4921 scope.go:117] "RemoveContainer" containerID="a363a2891ae209c0738384e64d213015ecdd740a02bafc4e7683a47b027c8f77" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.298449 4921 scope.go:117] "RemoveContainer" containerID="4132b1c906451df3acc71d98e9d47f8d05eed4f07057f8c7a5a3285dea15d9cb" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.346539 4921 scope.go:117] "RemoveContainer" 
containerID="cfea70834a70ba5fe61f533f89246186810dd47d5fec6d592137de4774f7d54b" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.396634 4921 scope.go:117] "RemoveContainer" containerID="1b2a8ea04d664bb89c44fecd7ca1ebde743a3540b6627c4e98a301293bdafb38" Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.819283 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" event={"ID":"e2031892-1082-4d65-8768-ac76c82bfff0","Type":"ContainerStarted","Data":"8a2dc04e56154073b796882a618b9dce3727c28d2a8a4f8c61b892117102b146"} Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.819351 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" event={"ID":"e2031892-1082-4d65-8768-ac76c82bfff0","Type":"ContainerStarted","Data":"84cdcf425414cee2bffca1a8987f86ea1037a340a7c05dac382ae8b5b86e6e50"} Mar 12 13:40:42 crc kubenswrapper[4921]: I0312 13:40:42.842280 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" podStartSLOduration=2.166872081 podStartE2EDuration="2.842264243s" podCreationTimestamp="2026-03-12 13:40:40 +0000 UTC" firstStartedPulling="2026-03-12 13:40:41.803157162 +0000 UTC m=+1864.493229133" lastFinishedPulling="2026-03-12 13:40:42.478549324 +0000 UTC m=+1865.168621295" observedRunningTime="2026-03-12 13:40:42.833935107 +0000 UTC m=+1865.524007098" watchObservedRunningTime="2026-03-12 13:40:42.842264243 +0000 UTC m=+1865.532336214" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.676290 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.679675 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.696267 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.815098 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55chk\" (UniqueName: \"kubernetes.io/projected/74d62bfa-7599-4992-b5e7-4220aa3a6443-kube-api-access-55chk\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.815171 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-catalog-content\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.815330 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-utilities\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.916309 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55chk\" (UniqueName: \"kubernetes.io/projected/74d62bfa-7599-4992-b5e7-4220aa3a6443-kube-api-access-55chk\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.916355 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-catalog-content\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.916414 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-utilities\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.916952 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-utilities\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.917055 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-catalog-content\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:44 crc kubenswrapper[4921]: I0312 13:40:44.937796 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55chk\" (UniqueName: \"kubernetes.io/projected/74d62bfa-7599-4992-b5e7-4220aa3a6443-kube-api-access-55chk\") pod \"community-operators-cfsth\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:45 crc kubenswrapper[4921]: I0312 13:40:45.006922 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:45 crc kubenswrapper[4921]: I0312 13:40:45.323512 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 13:40:45 crc kubenswrapper[4921]: I0312 13:40:45.843101 4921 generic.go:334] "Generic (PLEG): container finished" podID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerID="58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562" exitCode=0 Mar 12 13:40:45 crc kubenswrapper[4921]: I0312 13:40:45.843251 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfsth" event={"ID":"74d62bfa-7599-4992-b5e7-4220aa3a6443","Type":"ContainerDied","Data":"58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562"} Mar 12 13:40:45 crc kubenswrapper[4921]: I0312 13:40:45.843456 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfsth" event={"ID":"74d62bfa-7599-4992-b5e7-4220aa3a6443","Type":"ContainerStarted","Data":"2f3c90d7299a4227e42a6b82283f5f543968e4b0e5a6ede7e1d3e0f66ca1e02f"} Mar 12 13:40:46 crc kubenswrapper[4921]: I0312 13:40:46.983896 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:40:46 crc kubenswrapper[4921]: E0312 13:40:46.984213 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.067146 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lf7s5"] Mar 
12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.069258 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.082252 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lf7s5"] Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.179079 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-catalog-content\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.179218 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-utilities\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.179333 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v892\" (UniqueName: \"kubernetes.io/projected/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-kube-api-access-4v892\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.280850 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v892\" (UniqueName: \"kubernetes.io/projected/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-kube-api-access-4v892\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " 
pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.280962 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-catalog-content\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.281059 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-utilities\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.281676 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-utilities\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.281719 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-catalog-content\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.303449 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v892\" (UniqueName: \"kubernetes.io/projected/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-kube-api-access-4v892\") pod \"certified-operators-lf7s5\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " 
pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.385764 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.834676 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lf7s5"] Mar 12 13:40:47 crc kubenswrapper[4921]: W0312 13:40:47.853487 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9662bc1_fa95_47ed_be26_d91d2b0d42c2.slice/crio-fa1ca24a6e2fd7767ef5b019ae01b30f4013a218d91f3888717180a99c846ec3 WatchSource:0}: Error finding container fa1ca24a6e2fd7767ef5b019ae01b30f4013a218d91f3888717180a99c846ec3: Status 404 returned error can't find the container with id fa1ca24a6e2fd7767ef5b019ae01b30f4013a218d91f3888717180a99c846ec3 Mar 12 13:40:47 crc kubenswrapper[4921]: I0312 13:40:47.876528 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7s5" event={"ID":"d9662bc1-fa95-47ed-be26-d91d2b0d42c2","Type":"ContainerStarted","Data":"fa1ca24a6e2fd7767ef5b019ae01b30f4013a218d91f3888717180a99c846ec3"} Mar 12 13:40:48 crc kubenswrapper[4921]: I0312 13:40:48.884916 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerID="520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c" exitCode=0 Mar 12 13:40:48 crc kubenswrapper[4921]: I0312 13:40:48.884966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7s5" event={"ID":"d9662bc1-fa95-47ed-be26-d91d2b0d42c2","Type":"ContainerDied","Data":"520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c"} Mar 12 13:40:50 crc kubenswrapper[4921]: I0312 13:40:50.050571 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-lqd7h"] Mar 12 13:40:50 crc kubenswrapper[4921]: I0312 13:40:50.065446 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pm2mb"] Mar 12 13:40:50 crc kubenswrapper[4921]: I0312 13:40:50.086024 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pm2mb"] Mar 12 13:40:50 crc kubenswrapper[4921]: I0312 13:40:50.095117 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-lqd7h"] Mar 12 13:40:51 crc kubenswrapper[4921]: I0312 13:40:51.909976 4921 generic.go:334] "Generic (PLEG): container finished" podID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerID="f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8" exitCode=0 Mar 12 13:40:51 crc kubenswrapper[4921]: I0312 13:40:51.910115 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfsth" event={"ID":"74d62bfa-7599-4992-b5e7-4220aa3a6443","Type":"ContainerDied","Data":"f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8"} Mar 12 13:40:51 crc kubenswrapper[4921]: I0312 13:40:51.912568 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerID="d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116" exitCode=0 Mar 12 13:40:51 crc kubenswrapper[4921]: I0312 13:40:51.912627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7s5" event={"ID":"d9662bc1-fa95-47ed-be26-d91d2b0d42c2","Type":"ContainerDied","Data":"d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116"} Mar 12 13:40:51 crc kubenswrapper[4921]: I0312 13:40:51.992174 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f893a7c-5319-4a48-b19e-7405f7d64887" path="/var/lib/kubelet/pods/6f893a7c-5319-4a48-b19e-7405f7d64887/volumes" Mar 12 13:40:51 crc kubenswrapper[4921]: 
I0312 13:40:51.992722 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8b648cc-0abb-4d1a-8287-9041c336c678" path="/var/lib/kubelet/pods/b8b648cc-0abb-4d1a-8287-9041c336c678/volumes" Mar 12 13:40:52 crc kubenswrapper[4921]: I0312 13:40:52.923902 4921 generic.go:334] "Generic (PLEG): container finished" podID="e2031892-1082-4d65-8768-ac76c82bfff0" containerID="8a2dc04e56154073b796882a618b9dce3727c28d2a8a4f8c61b892117102b146" exitCode=0 Mar 12 13:40:52 crc kubenswrapper[4921]: I0312 13:40:52.924002 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" event={"ID":"e2031892-1082-4d65-8768-ac76c82bfff0","Type":"ContainerDied","Data":"8a2dc04e56154073b796882a618b9dce3727c28d2a8a4f8c61b892117102b146"} Mar 12 13:40:52 crc kubenswrapper[4921]: I0312 13:40:52.927737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7s5" event={"ID":"d9662bc1-fa95-47ed-be26-d91d2b0d42c2","Type":"ContainerStarted","Data":"9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1"} Mar 12 13:40:52 crc kubenswrapper[4921]: I0312 13:40:52.931122 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfsth" event={"ID":"74d62bfa-7599-4992-b5e7-4220aa3a6443","Type":"ContainerStarted","Data":"612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8"} Mar 12 13:40:52 crc kubenswrapper[4921]: I0312 13:40:52.971165 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfsth" podStartSLOduration=2.38175793 podStartE2EDuration="8.971147873s" podCreationTimestamp="2026-03-12 13:40:44 +0000 UTC" firstStartedPulling="2026-03-12 13:40:45.844788275 +0000 UTC m=+1868.534860246" lastFinishedPulling="2026-03-12 13:40:52.434178218 +0000 UTC m=+1875.124250189" observedRunningTime="2026-03-12 13:40:52.967545143 +0000 UTC m=+1875.657617164" 
watchObservedRunningTime="2026-03-12 13:40:52.971147873 +0000 UTC m=+1875.661219844" Mar 12 13:40:52 crc kubenswrapper[4921]: I0312 13:40:52.998847 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lf7s5" podStartSLOduration=2.451957109 podStartE2EDuration="5.998825583s" podCreationTimestamp="2026-03-12 13:40:47 +0000 UTC" firstStartedPulling="2026-03-12 13:40:48.88944784 +0000 UTC m=+1871.579519811" lastFinishedPulling="2026-03-12 13:40:52.436316304 +0000 UTC m=+1875.126388285" observedRunningTime="2026-03-12 13:40:52.997983737 +0000 UTC m=+1875.688055728" watchObservedRunningTime="2026-03-12 13:40:52.998825583 +0000 UTC m=+1875.688897564" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.360545 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.437406 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2wc2\" (UniqueName: \"kubernetes.io/projected/e2031892-1082-4d65-8768-ac76c82bfff0-kube-api-access-l2wc2\") pod \"e2031892-1082-4d65-8768-ac76c82bfff0\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.438254 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-ssh-key-openstack-edpm-ipam\") pod \"e2031892-1082-4d65-8768-ac76c82bfff0\" (UID: \"e2031892-1082-4d65-8768-ac76c82bfff0\") " Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.438320 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-inventory\") pod \"e2031892-1082-4d65-8768-ac76c82bfff0\" (UID: 
\"e2031892-1082-4d65-8768-ac76c82bfff0\") " Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.443028 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2031892-1082-4d65-8768-ac76c82bfff0-kube-api-access-l2wc2" (OuterVolumeSpecName: "kube-api-access-l2wc2") pod "e2031892-1082-4d65-8768-ac76c82bfff0" (UID: "e2031892-1082-4d65-8768-ac76c82bfff0"). InnerVolumeSpecName "kube-api-access-l2wc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.462158 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-inventory" (OuterVolumeSpecName: "inventory") pod "e2031892-1082-4d65-8768-ac76c82bfff0" (UID: "e2031892-1082-4d65-8768-ac76c82bfff0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.463730 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e2031892-1082-4d65-8768-ac76c82bfff0" (UID: "e2031892-1082-4d65-8768-ac76c82bfff0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.540132 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.540164 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2031892-1082-4d65-8768-ac76c82bfff0-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.540173 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2wc2\" (UniqueName: \"kubernetes.io/projected/e2031892-1082-4d65-8768-ac76c82bfff0-kube-api-access-l2wc2\") on node \"crc\" DevicePath \"\"" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.956955 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" event={"ID":"e2031892-1082-4d65-8768-ac76c82bfff0","Type":"ContainerDied","Data":"84cdcf425414cee2bffca1a8987f86ea1037a340a7c05dac382ae8b5b86e6e50"} Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.957001 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84cdcf425414cee2bffca1a8987f86ea1037a340a7c05dac382ae8b5b86e6e50" Mar 12 13:40:54 crc kubenswrapper[4921]: I0312 13:40:54.957047 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b" Mar 12 13:40:55 crc kubenswrapper[4921]: I0312 13:40:55.008140 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:55 crc kubenswrapper[4921]: I0312 13:40:55.008197 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:55 crc kubenswrapper[4921]: I0312 13:40:55.098541 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:40:57 crc kubenswrapper[4921]: I0312 13:40:57.386440 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:57 crc kubenswrapper[4921]: I0312 13:40:57.387094 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:57 crc kubenswrapper[4921]: I0312 13:40:57.432649 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:58 crc kubenswrapper[4921]: I0312 13:40:58.084334 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:40:58 crc kubenswrapper[4921]: I0312 13:40:58.673608 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lf7s5"] Mar 12 13:40:58 crc kubenswrapper[4921]: I0312 13:40:58.984593 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:40:58 crc kubenswrapper[4921]: E0312 13:40:58.985150 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.008166 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lf7s5" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="registry-server" containerID="cri-o://9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1" gracePeriod=2 Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.502305 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.564345 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-catalog-content\") pod \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.564468 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-utilities\") pod \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.564493 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v892\" (UniqueName: \"kubernetes.io/projected/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-kube-api-access-4v892\") pod \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\" (UID: \"d9662bc1-fa95-47ed-be26-d91d2b0d42c2\") " Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.565445 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-utilities" (OuterVolumeSpecName: "utilities") pod "d9662bc1-fa95-47ed-be26-d91d2b0d42c2" (UID: "d9662bc1-fa95-47ed-be26-d91d2b0d42c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.578451 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-kube-api-access-4v892" (OuterVolumeSpecName: "kube-api-access-4v892") pod "d9662bc1-fa95-47ed-be26-d91d2b0d42c2" (UID: "d9662bc1-fa95-47ed-be26-d91d2b0d42c2"). InnerVolumeSpecName "kube-api-access-4v892". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.651638 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9662bc1-fa95-47ed-be26-d91d2b0d42c2" (UID: "d9662bc1-fa95-47ed-be26-d91d2b0d42c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.666900 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.666933 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:41:00 crc kubenswrapper[4921]: I0312 13:41:00.666945 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v892\" (UniqueName: \"kubernetes.io/projected/d9662bc1-fa95-47ed-be26-d91d2b0d42c2-kube-api-access-4v892\") on node \"crc\" DevicePath \"\"" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.021140 4921 generic.go:334] "Generic (PLEG): container finished" podID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerID="9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1" exitCode=0 Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.021193 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7s5" event={"ID":"d9662bc1-fa95-47ed-be26-d91d2b0d42c2","Type":"ContainerDied","Data":"9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1"} Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.021263 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7s5" event={"ID":"d9662bc1-fa95-47ed-be26-d91d2b0d42c2","Type":"ContainerDied","Data":"fa1ca24a6e2fd7767ef5b019ae01b30f4013a218d91f3888717180a99c846ec3"} Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.021280 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lf7s5" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.021296 4921 scope.go:117] "RemoveContainer" containerID="9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.042335 4921 scope.go:117] "RemoveContainer" containerID="d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.077478 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lf7s5"] Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.082356 4921 scope.go:117] "RemoveContainer" containerID="520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.087526 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lf7s5"] Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.122303 4921 scope.go:117] "RemoveContainer" containerID="9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1" Mar 12 13:41:01 crc kubenswrapper[4921]: E0312 13:41:01.124533 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1\": container with ID starting with 9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1 not found: ID does not exist" containerID="9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.124583 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1"} err="failed to get container status \"9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1\": rpc error: code = NotFound desc = could not find 
container \"9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1\": container with ID starting with 9d84cd6da87366a336a6001137a19b3e8d5d0a4fd88df4af5687eb6d08c04bf1 not found: ID does not exist" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.124618 4921 scope.go:117] "RemoveContainer" containerID="d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116" Mar 12 13:41:01 crc kubenswrapper[4921]: E0312 13:41:01.124977 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116\": container with ID starting with d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116 not found: ID does not exist" containerID="d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.125005 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116"} err="failed to get container status \"d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116\": rpc error: code = NotFound desc = could not find container \"d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116\": container with ID starting with d18794017c19ad5863f1a537256aaa24e725dc026916cc5e95c70e0c2aaf5116 not found: ID does not exist" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.125021 4921 scope.go:117] "RemoveContainer" containerID="520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c" Mar 12 13:41:01 crc kubenswrapper[4921]: E0312 13:41:01.125708 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c\": container with ID starting with 520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c not found: ID does 
not exist" containerID="520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c" Mar 12 13:41:01 crc kubenswrapper[4921]: I0312 13:41:01.125739 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c"} err="failed to get container status \"520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c\": rpc error: code = NotFound desc = could not find container \"520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c\": container with ID starting with 520089e7a330e708d65ddd071342216ee677e37dd29cac8f7a365f1047f2e04c not found: ID does not exist" Mar 12 13:41:02 crc kubenswrapper[4921]: I0312 13:41:02.002870 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" path="/var/lib/kubelet/pods/d9662bc1-fa95-47ed-be26-d91d2b0d42c2/volumes" Mar 12 13:41:05 crc kubenswrapper[4921]: I0312 13:41:05.065068 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cfsth" Mar 12 13:41:08 crc kubenswrapper[4921]: I0312 13:41:08.294150 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 13:41:08 crc kubenswrapper[4921]: I0312 13:41:08.471357 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-drchr"] Mar 12 13:41:08 crc kubenswrapper[4921]: I0312 13:41:08.471789 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-drchr" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="registry-server" containerID="cri-o://0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135" gracePeriod=2 Mar 12 13:41:08 crc kubenswrapper[4921]: I0312 13:41:08.982089 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.054035 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcthw\" (UniqueName: \"kubernetes.io/projected/333229d4-397f-4504-9e02-f793f42324f4-kube-api-access-jcthw\") pod \"333229d4-397f-4504-9e02-f793f42324f4\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.054103 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-utilities\") pod \"333229d4-397f-4504-9e02-f793f42324f4\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.054275 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-catalog-content\") pod \"333229d4-397f-4504-9e02-f793f42324f4\" (UID: \"333229d4-397f-4504-9e02-f793f42324f4\") " Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.054901 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-utilities" (OuterVolumeSpecName: "utilities") pod "333229d4-397f-4504-9e02-f793f42324f4" (UID: "333229d4-397f-4504-9e02-f793f42324f4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.061008 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333229d4-397f-4504-9e02-f793f42324f4-kube-api-access-jcthw" (OuterVolumeSpecName: "kube-api-access-jcthw") pod "333229d4-397f-4504-9e02-f793f42324f4" (UID: "333229d4-397f-4504-9e02-f793f42324f4"). InnerVolumeSpecName "kube-api-access-jcthw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.108339 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "333229d4-397f-4504-9e02-f793f42324f4" (UID: "333229d4-397f-4504-9e02-f793f42324f4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.123867 4921 generic.go:334] "Generic (PLEG): container finished" podID="333229d4-397f-4504-9e02-f793f42324f4" containerID="0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135" exitCode=0 Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.123938 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-drchr" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.123957 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drchr" event={"ID":"333229d4-397f-4504-9e02-f793f42324f4","Type":"ContainerDied","Data":"0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135"} Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.124027 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-drchr" event={"ID":"333229d4-397f-4504-9e02-f793f42324f4","Type":"ContainerDied","Data":"ef8c16ca9f3b46878fba56a45597c4580bde2e2ef5705c4331e696f3fe0664cd"} Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.124046 4921 scope.go:117] "RemoveContainer" containerID="0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.147904 4921 scope.go:117] "RemoveContainer" containerID="3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4" Mar 12 13:41:09 crc kubenswrapper[4921]: 
I0312 13:41:09.155206 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-drchr"] Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.155652 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.155671 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcthw\" (UniqueName: \"kubernetes.io/projected/333229d4-397f-4504-9e02-f793f42324f4-kube-api-access-jcthw\") on node \"crc\" DevicePath \"\"" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.155683 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/333229d4-397f-4504-9e02-f793f42324f4-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.162765 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-drchr"] Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.168483 4921 scope.go:117] "RemoveContainer" containerID="a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.201923 4921 scope.go:117] "RemoveContainer" containerID="0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135" Mar 12 13:41:09 crc kubenswrapper[4921]: E0312 13:41:09.202343 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135\": container with ID starting with 0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135 not found: ID does not exist" containerID="0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 
13:41:09.202386 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135"} err="failed to get container status \"0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135\": rpc error: code = NotFound desc = could not find container \"0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135\": container with ID starting with 0ea5667e79a1622dcd0742e618db96e41479f5120bae756a04281871e8d2b135 not found: ID does not exist" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.202413 4921 scope.go:117] "RemoveContainer" containerID="3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4" Mar 12 13:41:09 crc kubenswrapper[4921]: E0312 13:41:09.202764 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4\": container with ID starting with 3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4 not found: ID does not exist" containerID="3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.202806 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4"} err="failed to get container status \"3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4\": rpc error: code = NotFound desc = could not find container \"3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4\": container with ID starting with 3d4e0c57be575143ea1013d89ace638b59d3c5d7cc5dc354e3fe812da5db0bd4 not found: ID does not exist" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.202848 4921 scope.go:117] "RemoveContainer" containerID="a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d" Mar 12 13:41:09 crc 
kubenswrapper[4921]: E0312 13:41:09.203281 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d\": container with ID starting with a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d not found: ID does not exist" containerID="a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.203334 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d"} err="failed to get container status \"a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d\": rpc error: code = NotFound desc = could not find container \"a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d\": container with ID starting with a78d12a4c60f8a5b36db1dd4b417b6ea47e9aea39886d04b7a8bd45328b7606d not found: ID does not exist" Mar 12 13:41:09 crc kubenswrapper[4921]: I0312 13:41:09.992169 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="333229d4-397f-4504-9e02-f793f42324f4" path="/var/lib/kubelet/pods/333229d4-397f-4504-9e02-f793f42324f4/volumes" Mar 12 13:41:10 crc kubenswrapper[4921]: I0312 13:41:10.983392 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:41:10 crc kubenswrapper[4921]: E0312 13:41:10.983675 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:41:25 crc 
kubenswrapper[4921]: I0312 13:41:25.983379 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:41:25 crc kubenswrapper[4921]: E0312 13:41:25.984021 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:41:34 crc kubenswrapper[4921]: I0312 13:41:34.056669 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hm292"] Mar 12 13:41:34 crc kubenswrapper[4921]: I0312 13:41:34.065497 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hm292"] Mar 12 13:41:36 crc kubenswrapper[4921]: I0312 13:41:36.003905 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00afedfb-6f74-48a9-92cc-4b7d6ac94161" path="/var/lib/kubelet/pods/00afedfb-6f74-48a9-92cc-4b7d6ac94161/volumes" Mar 12 13:41:39 crc kubenswrapper[4921]: I0312 13:41:39.984075 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:41:40 crc kubenswrapper[4921]: I0312 13:41:40.475247 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"a4a7fc64c7f961b98d4fd823b1232409bdff82a1f121c2f39f52f57afd9c59e2"} Mar 12 13:41:42 crc kubenswrapper[4921]: I0312 13:41:42.602888 4921 scope.go:117] "RemoveContainer" containerID="93cea6bebede9d4972c1fa8eca3402fa265a189c91c85e9ccbe28acf6dc9487c" Mar 12 13:41:42 crc kubenswrapper[4921]: I0312 13:41:42.669938 4921 
scope.go:117] "RemoveContainer" containerID="cec077e080f0d09301e16fc0bedc02ab7146565f449d5c63057e60869a1d8412" Mar 12 13:41:42 crc kubenswrapper[4921]: I0312 13:41:42.727909 4921 scope.go:117] "RemoveContainer" containerID="6aa235037faf86183a1d5c20a523b0adf18be64c6990fdca817c654d763fe1ae" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.141669 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555382-9p2xd"] Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.142902 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="extract-content" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.142923 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="extract-content" Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.142947 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="extract-utilities" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.142998 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="extract-utilities" Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.143026 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="extract-content" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143039 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="extract-content" Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.143051 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="extract-utilities" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143062 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="extract-utilities" Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.143081 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="registry-server" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143092 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="registry-server" Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.143114 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="registry-server" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143125 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="registry-server" Mar 12 13:42:00 crc kubenswrapper[4921]: E0312 13:42:00.143147 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2031892-1082-4d65-8768-ac76c82bfff0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143161 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2031892-1082-4d65-8768-ac76c82bfff0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143465 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2031892-1082-4d65-8768-ac76c82bfff0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143487 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="333229d4-397f-4504-9e02-f793f42324f4" containerName="registry-server" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.143508 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9662bc1-fa95-47ed-be26-d91d2b0d42c2" containerName="registry-server" Mar 12 13:42:00 crc 
kubenswrapper[4921]: I0312 13:42:00.144527 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.147640 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.148000 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.148175 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.151080 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555382-9p2xd"] Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.241535 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4vrh\" (UniqueName: \"kubernetes.io/projected/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99-kube-api-access-l4vrh\") pod \"auto-csr-approver-29555382-9p2xd\" (UID: \"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99\") " pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.342773 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4vrh\" (UniqueName: \"kubernetes.io/projected/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99-kube-api-access-l4vrh\") pod \"auto-csr-approver-29555382-9p2xd\" (UID: \"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99\") " pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.363448 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4vrh\" (UniqueName: \"kubernetes.io/projected/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99-kube-api-access-l4vrh\") pod 
\"auto-csr-approver-29555382-9p2xd\" (UID: \"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99\") " pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.469394 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:00 crc kubenswrapper[4921]: I0312 13:42:00.915588 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555382-9p2xd"] Mar 12 13:42:01 crc kubenswrapper[4921]: I0312 13:42:01.669398 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" event={"ID":"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99","Type":"ContainerStarted","Data":"38acd4843b902a396d35ff9a7e9117b5250f09dfd0d9930baf8ee356df4729cf"} Mar 12 13:42:02 crc kubenswrapper[4921]: I0312 13:42:02.679969 4921 generic.go:334] "Generic (PLEG): container finished" podID="64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99" containerID="86fd9a55d095552a12faf6f254fc0f37bf76c29cd783a1fb9032e2b2cca9982b" exitCode=0 Mar 12 13:42:02 crc kubenswrapper[4921]: I0312 13:42:02.680050 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" event={"ID":"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99","Type":"ContainerDied","Data":"86fd9a55d095552a12faf6f254fc0f37bf76c29cd783a1fb9032e2b2cca9982b"} Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.015653 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.110637 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4vrh\" (UniqueName: \"kubernetes.io/projected/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99-kube-api-access-l4vrh\") pod \"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99\" (UID: \"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99\") " Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.115833 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99-kube-api-access-l4vrh" (OuterVolumeSpecName: "kube-api-access-l4vrh") pod "64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99" (UID: "64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99"). InnerVolumeSpecName "kube-api-access-l4vrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.214553 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4vrh\" (UniqueName: \"kubernetes.io/projected/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99-kube-api-access-l4vrh\") on node \"crc\" DevicePath \"\"" Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.697632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" event={"ID":"64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99","Type":"ContainerDied","Data":"38acd4843b902a396d35ff9a7e9117b5250f09dfd0d9930baf8ee356df4729cf"} Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.697677 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38acd4843b902a396d35ff9a7e9117b5250f09dfd0d9930baf8ee356df4729cf" Mar 12 13:42:04 crc kubenswrapper[4921]: I0312 13:42:04.697748 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555382-9p2xd" Mar 12 13:42:05 crc kubenswrapper[4921]: I0312 13:42:05.077993 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555376-2fkvr"] Mar 12 13:42:05 crc kubenswrapper[4921]: I0312 13:42:05.084668 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555376-2fkvr"] Mar 12 13:42:06 crc kubenswrapper[4921]: I0312 13:42:06.000184 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cc7ff0c-3e22-46c8-b128-e637f21f83d3" path="/var/lib/kubelet/pods/7cc7ff0c-3e22-46c8-b128-e637f21f83d3/volumes" Mar 12 13:42:42 crc kubenswrapper[4921]: I0312 13:42:42.910896 4921 scope.go:117] "RemoveContainer" containerID="6488db2f8389620c8d4df043174953e4e0dafe0036a135bac043ac1faba99efc" Mar 12 13:43:56 crc kubenswrapper[4921]: I0312 13:43:56.324223 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:43:56 crc kubenswrapper[4921]: I0312 13:43:56.324831 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.142781 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555384-ckh8q"] Mar 12 13:44:00 crc kubenswrapper[4921]: E0312 13:44:00.143456 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99" containerName="oc" Mar 12 13:44:00 crc 
kubenswrapper[4921]: I0312 13:44:00.143469 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99" containerName="oc" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.143638 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99" containerName="oc" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.144328 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.149477 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.149689 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.150049 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.160077 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555384-ckh8q"] Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.296429 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmg6\" (UniqueName: \"kubernetes.io/projected/993efdcb-eebd-4fef-87eb-3a28609a17c4-kube-api-access-7fmg6\") pod \"auto-csr-approver-29555384-ckh8q\" (UID: \"993efdcb-eebd-4fef-87eb-3a28609a17c4\") " pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.398548 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmg6\" (UniqueName: \"kubernetes.io/projected/993efdcb-eebd-4fef-87eb-3a28609a17c4-kube-api-access-7fmg6\") pod \"auto-csr-approver-29555384-ckh8q\" 
(UID: \"993efdcb-eebd-4fef-87eb-3a28609a17c4\") " pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.432078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmg6\" (UniqueName: \"kubernetes.io/projected/993efdcb-eebd-4fef-87eb-3a28609a17c4-kube-api-access-7fmg6\") pod \"auto-csr-approver-29555384-ckh8q\" (UID: \"993efdcb-eebd-4fef-87eb-3a28609a17c4\") " pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.463194 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.973922 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555384-ckh8q"] Mar 12 13:44:00 crc kubenswrapper[4921]: W0312 13:44:00.991277 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod993efdcb_eebd_4fef_87eb_3a28609a17c4.slice/crio-b1b1d26bc0c359a67ac3ef2c4a7ead955e6f7243e656991d6d1075c8cab66982 WatchSource:0}: Error finding container b1b1d26bc0c359a67ac3ef2c4a7ead955e6f7243e656991d6d1075c8cab66982: Status 404 returned error can't find the container with id b1b1d26bc0c359a67ac3ef2c4a7ead955e6f7243e656991d6d1075c8cab66982 Mar 12 13:44:00 crc kubenswrapper[4921]: I0312 13:44:00.994538 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:44:01 crc kubenswrapper[4921]: I0312 13:44:01.053800 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" event={"ID":"993efdcb-eebd-4fef-87eb-3a28609a17c4","Type":"ContainerStarted","Data":"b1b1d26bc0c359a67ac3ef2c4a7ead955e6f7243e656991d6d1075c8cab66982"} Mar 12 13:44:03 crc kubenswrapper[4921]: I0312 13:44:03.082729 4921 generic.go:334] 
"Generic (PLEG): container finished" podID="993efdcb-eebd-4fef-87eb-3a28609a17c4" containerID="a308e62970f42847874ce1824eca4d190448484417d1fcf05ddc2f476ca3fab5" exitCode=0 Mar 12 13:44:03 crc kubenswrapper[4921]: I0312 13:44:03.082901 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" event={"ID":"993efdcb-eebd-4fef-87eb-3a28609a17c4","Type":"ContainerDied","Data":"a308e62970f42847874ce1824eca4d190448484417d1fcf05ddc2f476ca3fab5"} Mar 12 13:44:04 crc kubenswrapper[4921]: I0312 13:44:04.374860 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:04 crc kubenswrapper[4921]: I0312 13:44:04.486678 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fmg6\" (UniqueName: \"kubernetes.io/projected/993efdcb-eebd-4fef-87eb-3a28609a17c4-kube-api-access-7fmg6\") pod \"993efdcb-eebd-4fef-87eb-3a28609a17c4\" (UID: \"993efdcb-eebd-4fef-87eb-3a28609a17c4\") " Mar 12 13:44:04 crc kubenswrapper[4921]: I0312 13:44:04.493206 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993efdcb-eebd-4fef-87eb-3a28609a17c4-kube-api-access-7fmg6" (OuterVolumeSpecName: "kube-api-access-7fmg6") pod "993efdcb-eebd-4fef-87eb-3a28609a17c4" (UID: "993efdcb-eebd-4fef-87eb-3a28609a17c4"). InnerVolumeSpecName "kube-api-access-7fmg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:44:04 crc kubenswrapper[4921]: I0312 13:44:04.588627 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fmg6\" (UniqueName: \"kubernetes.io/projected/993efdcb-eebd-4fef-87eb-3a28609a17c4-kube-api-access-7fmg6\") on node \"crc\" DevicePath \"\"" Mar 12 13:44:05 crc kubenswrapper[4921]: I0312 13:44:05.106627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" event={"ID":"993efdcb-eebd-4fef-87eb-3a28609a17c4","Type":"ContainerDied","Data":"b1b1d26bc0c359a67ac3ef2c4a7ead955e6f7243e656991d6d1075c8cab66982"} Mar 12 13:44:05 crc kubenswrapper[4921]: I0312 13:44:05.106687 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1b1d26bc0c359a67ac3ef2c4a7ead955e6f7243e656991d6d1075c8cab66982" Mar 12 13:44:05 crc kubenswrapper[4921]: I0312 13:44:05.106714 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555384-ckh8q" Mar 12 13:44:05 crc kubenswrapper[4921]: I0312 13:44:05.438764 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555378-c54xt"] Mar 12 13:44:05 crc kubenswrapper[4921]: I0312 13:44:05.446571 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555378-c54xt"] Mar 12 13:44:05 crc kubenswrapper[4921]: I0312 13:44:05.993303 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c84ef1cf-f416-4617-a119-169a2104bc89" path="/var/lib/kubelet/pods/c84ef1cf-f416-4617-a119-169a2104bc89/volumes" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.149181 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r8s9b"] Mar 12 13:44:20 crc kubenswrapper[4921]: E0312 13:44:20.151573 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="993efdcb-eebd-4fef-87eb-3a28609a17c4" containerName="oc" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.151681 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="993efdcb-eebd-4fef-87eb-3a28609a17c4" containerName="oc" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.152007 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="993efdcb-eebd-4fef-87eb-3a28609a17c4" containerName="oc" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.154123 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.161540 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8s9b"] Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.222878 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-utilities\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.223048 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dftz\" (UniqueName: \"kubernetes.io/projected/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-kube-api-access-9dftz\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.223112 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-catalog-content\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " 
pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.324438 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-utilities\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.324564 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dftz\" (UniqueName: \"kubernetes.io/projected/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-kube-api-access-9dftz\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.324598 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-catalog-content\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.325073 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-utilities\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.325102 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-catalog-content\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc 
kubenswrapper[4921]: I0312 13:44:20.359597 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dftz\" (UniqueName: \"kubernetes.io/projected/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-kube-api-access-9dftz\") pod \"redhat-operators-r8s9b\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.503925 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:20 crc kubenswrapper[4921]: I0312 13:44:20.977211 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8s9b"] Mar 12 13:44:21 crc kubenswrapper[4921]: I0312 13:44:21.271671 4921 generic.go:334] "Generic (PLEG): container finished" podID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerID="38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717" exitCode=0 Mar 12 13:44:21 crc kubenswrapper[4921]: I0312 13:44:21.271723 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerDied","Data":"38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717"} Mar 12 13:44:21 crc kubenswrapper[4921]: I0312 13:44:21.271755 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerStarted","Data":"86e93e3075b7f31b22f174bac84581c67a1d79655c2f0fd7d77dace37cba324f"} Mar 12 13:44:22 crc kubenswrapper[4921]: I0312 13:44:22.290977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerStarted","Data":"ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44"} Mar 12 13:44:24 crc kubenswrapper[4921]: I0312 
13:44:24.306994 4921 generic.go:334] "Generic (PLEG): container finished" podID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerID="ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44" exitCode=0 Mar 12 13:44:24 crc kubenswrapper[4921]: I0312 13:44:24.307089 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerDied","Data":"ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44"} Mar 12 13:44:26 crc kubenswrapper[4921]: I0312 13:44:26.323464 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:44:26 crc kubenswrapper[4921]: I0312 13:44:26.323855 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:44:26 crc kubenswrapper[4921]: I0312 13:44:26.325381 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerStarted","Data":"4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1"} Mar 12 13:44:30 crc kubenswrapper[4921]: I0312 13:44:30.504462 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:30 crc kubenswrapper[4921]: I0312 13:44:30.505074 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:31 crc 
kubenswrapper[4921]: I0312 13:44:31.544786 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r8s9b" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="registry-server" probeResult="failure" output=< Mar 12 13:44:31 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 13:44:31 crc kubenswrapper[4921]: > Mar 12 13:44:40 crc kubenswrapper[4921]: I0312 13:44:40.566428 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:40 crc kubenswrapper[4921]: I0312 13:44:40.596568 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r8s9b" podStartSLOduration=16.126148725 podStartE2EDuration="20.596543044s" podCreationTimestamp="2026-03-12 13:44:20 +0000 UTC" firstStartedPulling="2026-03-12 13:44:21.273460256 +0000 UTC m=+2083.963532227" lastFinishedPulling="2026-03-12 13:44:25.743854555 +0000 UTC m=+2088.433926546" observedRunningTime="2026-03-12 13:44:26.347489834 +0000 UTC m=+2089.037561805" watchObservedRunningTime="2026-03-12 13:44:40.596543044 +0000 UTC m=+2103.286615045" Mar 12 13:44:40 crc kubenswrapper[4921]: I0312 13:44:40.627641 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:40 crc kubenswrapper[4921]: I0312 13:44:40.807760 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8s9b"] Mar 12 13:44:42 crc kubenswrapper[4921]: I0312 13:44:42.460632 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r8s9b" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="registry-server" containerID="cri-o://4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1" gracePeriod=2 Mar 12 13:44:42 crc kubenswrapper[4921]: I0312 
13:44:42.943314 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.032166 4921 scope.go:117] "RemoveContainer" containerID="ffc2423c09527a0584d870700ec1c9dbbe5c170e2966414c03c79efebf0d440f" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.048832 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dftz\" (UniqueName: \"kubernetes.io/projected/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-kube-api-access-9dftz\") pod \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.048987 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-catalog-content\") pod \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.049157 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-utilities\") pod \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\" (UID: \"6ec8a571-b28a-4ab8-97ea-1c29932b8cad\") " Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.051045 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-utilities" (OuterVolumeSpecName: "utilities") pod "6ec8a571-b28a-4ab8-97ea-1c29932b8cad" (UID: "6ec8a571-b28a-4ab8-97ea-1c29932b8cad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.057028 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-kube-api-access-9dftz" (OuterVolumeSpecName: "kube-api-access-9dftz") pod "6ec8a571-b28a-4ab8-97ea-1c29932b8cad" (UID: "6ec8a571-b28a-4ab8-97ea-1c29932b8cad"). InnerVolumeSpecName "kube-api-access-9dftz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.151718 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.152082 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dftz\" (UniqueName: \"kubernetes.io/projected/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-kube-api-access-9dftz\") on node \"crc\" DevicePath \"\"" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.221844 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ec8a571-b28a-4ab8-97ea-1c29932b8cad" (UID: "6ec8a571-b28a-4ab8-97ea-1c29932b8cad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.253509 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ec8a571-b28a-4ab8-97ea-1c29932b8cad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.472328 4921 generic.go:334] "Generic (PLEG): container finished" podID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerID="4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1" exitCode=0 Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.472397 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r8s9b" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.472414 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerDied","Data":"4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1"} Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.472483 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8s9b" event={"ID":"6ec8a571-b28a-4ab8-97ea-1c29932b8cad","Type":"ContainerDied","Data":"86e93e3075b7f31b22f174bac84581c67a1d79655c2f0fd7d77dace37cba324f"} Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.472508 4921 scope.go:117] "RemoveContainer" containerID="4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.519651 4921 scope.go:117] "RemoveContainer" containerID="ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.525315 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8s9b"] Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 
13:44:43.534893 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r8s9b"] Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.542028 4921 scope.go:117] "RemoveContainer" containerID="38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.558428 4921 scope.go:117] "RemoveContainer" containerID="4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1" Mar 12 13:44:43 crc kubenswrapper[4921]: E0312 13:44:43.558934 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1\": container with ID starting with 4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1 not found: ID does not exist" containerID="4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.558981 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1"} err="failed to get container status \"4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1\": rpc error: code = NotFound desc = could not find container \"4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1\": container with ID starting with 4e8abf1c305827321f6a7094567a538200298bccf5019abe69b660d7cd8c1ba1 not found: ID does not exist" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.559010 4921 scope.go:117] "RemoveContainer" containerID="ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44" Mar 12 13:44:43 crc kubenswrapper[4921]: E0312 13:44:43.559440 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44\": container with ID 
starting with ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44 not found: ID does not exist" containerID="ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.559562 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44"} err="failed to get container status \"ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44\": rpc error: code = NotFound desc = could not find container \"ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44\": container with ID starting with ac5559fb8c684418e119ea6a3ab0fa688a0d71fdf4cdf97174cdc3c09030ad44 not found: ID does not exist" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.559671 4921 scope.go:117] "RemoveContainer" containerID="38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717" Mar 12 13:44:43 crc kubenswrapper[4921]: E0312 13:44:43.560025 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717\": container with ID starting with 38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717 not found: ID does not exist" containerID="38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.560154 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717"} err="failed to get container status \"38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717\": rpc error: code = NotFound desc = could not find container \"38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717\": container with ID starting with 38c7fb3776dc33d9fb59311321a6859b340f8a725fecd19a60c7999b148e2717 not found: 
ID does not exist" Mar 12 13:44:43 crc kubenswrapper[4921]: I0312 13:44:43.998245 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" path="/var/lib/kubelet/pods/6ec8a571-b28a-4ab8-97ea-1c29932b8cad/volumes" Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.324442 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.325179 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.325249 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.326201 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4a7fc64c7f961b98d4fd823b1232409bdff82a1f121c2f39f52f57afd9c59e2"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.326310 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" 
containerID="cri-o://a4a7fc64c7f961b98d4fd823b1232409bdff82a1f121c2f39f52f57afd9c59e2" gracePeriod=600 Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.594350 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="a4a7fc64c7f961b98d4fd823b1232409bdff82a1f121c2f39f52f57afd9c59e2" exitCode=0 Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.594386 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"a4a7fc64c7f961b98d4fd823b1232409bdff82a1f121c2f39f52f57afd9c59e2"} Mar 12 13:44:56 crc kubenswrapper[4921]: I0312 13:44:56.594721 4921 scope.go:117] "RemoveContainer" containerID="860c6b9b18b961d0e63a2c38be90f07d57f4869350f55730a8da257230eb70f8" Mar 12 13:44:57 crc kubenswrapper[4921]: I0312 13:44:57.609679 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"} Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.168809 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7"] Mar 12 13:45:00 crc kubenswrapper[4921]: E0312 13:45:00.169476 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="extract-utilities" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.169488 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="extract-utilities" Mar 12 13:45:00 crc kubenswrapper[4921]: E0312 13:45:00.169505 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" 
containerName="extract-content" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.169512 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="extract-content" Mar 12 13:45:00 crc kubenswrapper[4921]: E0312 13:45:00.169531 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="registry-server" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.169537 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="registry-server" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.169691 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec8a571-b28a-4ab8-97ea-1c29932b8cad" containerName="registry-server" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.170301 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.172291 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.174571 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.186320 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7"] Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.186936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/816dce54-4563-4471-815c-2c8ecbb5bad1-config-volume\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.187018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbmnm\" (UniqueName: \"kubernetes.io/projected/816dce54-4563-4471-815c-2c8ecbb5bad1-kube-api-access-rbmnm\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.187242 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/816dce54-4563-4471-815c-2c8ecbb5bad1-secret-volume\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.288348 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/816dce54-4563-4471-815c-2c8ecbb5bad1-config-volume\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.288408 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbmnm\" (UniqueName: \"kubernetes.io/projected/816dce54-4563-4471-815c-2c8ecbb5bad1-kube-api-access-rbmnm\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.288485 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/816dce54-4563-4471-815c-2c8ecbb5bad1-secret-volume\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.290342 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/816dce54-4563-4471-815c-2c8ecbb5bad1-config-volume\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.297370 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/816dce54-4563-4471-815c-2c8ecbb5bad1-secret-volume\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.320142 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbmnm\" (UniqueName: \"kubernetes.io/projected/816dce54-4563-4471-815c-2c8ecbb5bad1-kube-api-access-rbmnm\") pod \"collect-profiles-29555385-q57v7\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:00 crc kubenswrapper[4921]: I0312 13:45:00.492955 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:01 crc kubenswrapper[4921]: I0312 13:45:01.005775 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7"] Mar 12 13:45:01 crc kubenswrapper[4921]: I0312 13:45:01.694155 4921 generic.go:334] "Generic (PLEG): container finished" podID="816dce54-4563-4471-815c-2c8ecbb5bad1" containerID="a48add2b7ff4f878626c8146881de1bad7dd0ec1adfda661b4e4fad8a56794fe" exitCode=0 Mar 12 13:45:01 crc kubenswrapper[4921]: I0312 13:45:01.694215 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" event={"ID":"816dce54-4563-4471-815c-2c8ecbb5bad1","Type":"ContainerDied","Data":"a48add2b7ff4f878626c8146881de1bad7dd0ec1adfda661b4e4fad8a56794fe"} Mar 12 13:45:01 crc kubenswrapper[4921]: I0312 13:45:01.694539 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" event={"ID":"816dce54-4563-4471-815c-2c8ecbb5bad1","Type":"ContainerStarted","Data":"c853fdb41a409e7def53f8722d1b554289ad8b4bf6663e6319528f7bd7dd159c"} Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.103649 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.246367 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/816dce54-4563-4471-815c-2c8ecbb5bad1-secret-volume\") pod \"816dce54-4563-4471-815c-2c8ecbb5bad1\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.246550 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbmnm\" (UniqueName: \"kubernetes.io/projected/816dce54-4563-4471-815c-2c8ecbb5bad1-kube-api-access-rbmnm\") pod \"816dce54-4563-4471-815c-2c8ecbb5bad1\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.246602 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/816dce54-4563-4471-815c-2c8ecbb5bad1-config-volume\") pod \"816dce54-4563-4471-815c-2c8ecbb5bad1\" (UID: \"816dce54-4563-4471-815c-2c8ecbb5bad1\") " Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.247331 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/816dce54-4563-4471-815c-2c8ecbb5bad1-config-volume" (OuterVolumeSpecName: "config-volume") pod "816dce54-4563-4471-815c-2c8ecbb5bad1" (UID: "816dce54-4563-4471-815c-2c8ecbb5bad1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.254265 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/816dce54-4563-4471-815c-2c8ecbb5bad1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "816dce54-4563-4471-815c-2c8ecbb5bad1" (UID: "816dce54-4563-4471-815c-2c8ecbb5bad1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.254890 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/816dce54-4563-4471-815c-2c8ecbb5bad1-kube-api-access-rbmnm" (OuterVolumeSpecName: "kube-api-access-rbmnm") pod "816dce54-4563-4471-815c-2c8ecbb5bad1" (UID: "816dce54-4563-4471-815c-2c8ecbb5bad1"). InnerVolumeSpecName "kube-api-access-rbmnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.348280 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/816dce54-4563-4471-815c-2c8ecbb5bad1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.348317 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbmnm\" (UniqueName: \"kubernetes.io/projected/816dce54-4563-4471-815c-2c8ecbb5bad1-kube-api-access-rbmnm\") on node \"crc\" DevicePath \"\"" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.348326 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/816dce54-4563-4471-815c-2c8ecbb5bad1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.713917 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" event={"ID":"816dce54-4563-4471-815c-2c8ecbb5bad1","Type":"ContainerDied","Data":"c853fdb41a409e7def53f8722d1b554289ad8b4bf6663e6319528f7bd7dd159c"} Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.713954 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c853fdb41a409e7def53f8722d1b554289ad8b4bf6663e6319528f7bd7dd159c" Mar 12 13:45:03 crc kubenswrapper[4921]: I0312 13:45:03.713970 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7" Mar 12 13:45:04 crc kubenswrapper[4921]: I0312 13:45:04.180371 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"] Mar 12 13:45:04 crc kubenswrapper[4921]: I0312 13:45:04.187192 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555340-b9fqc"] Mar 12 13:45:05 crc kubenswrapper[4921]: I0312 13:45:05.992983 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f403288d-b503-4f0c-bf83-3b29ff86ab94" path="/var/lib/kubelet/pods/f403288d-b503-4f0c-bf83-3b29ff86ab94/volumes" Mar 12 13:45:43 crc kubenswrapper[4921]: I0312 13:45:43.136514 4921 scope.go:117] "RemoveContainer" containerID="fc91c9434028740655300c434b681676ae5f6bc96b088a1a58fa416bc98b3208" Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.156281 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.168686 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.177614 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.185124 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ngwk7"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.192164 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.198292 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-bct9b"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.203863 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.209434 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tklmg"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.215003 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.220560 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-22qhj"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.226449 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.232142 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.237894 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-dx9qx"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.243634 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-djzqb"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.249545 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.255026 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-v5mbg"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 
13:45:56.260545 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-kp84r"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.266244 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-tklmg"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.271893 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m7kpq"] Mar 12 13:45:56 crc kubenswrapper[4921]: I0312 13:45:56.279480 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-c5nbx"] Mar 12 13:45:57 crc kubenswrapper[4921]: I0312 13:45:57.997418 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dddd99f-1673-4e6e-983e-f5667c60e686" path="/var/lib/kubelet/pods/0dddd99f-1673-4e6e-983e-f5667c60e686/volumes" Mar 12 13:45:57 crc kubenswrapper[4921]: I0312 13:45:57.998989 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13f57794-2ec2-4a54-b5ec-e955a3e65288" path="/var/lib/kubelet/pods/13f57794-2ec2-4a54-b5ec-e955a3e65288/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.000099 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="287a8351-7199-4b48-90c1-e1a58233fae2" path="/var/lib/kubelet/pods/287a8351-7199-4b48-90c1-e1a58233fae2/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.001177 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db44bac-3fcf-42b8-9703-4dc130b8b31f" path="/var/lib/kubelet/pods/2db44bac-3fcf-42b8-9703-4dc130b8b31f/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.003283 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43644579-35dc-4418-86e1-4f40c7bdcb8c" path="/var/lib/kubelet/pods/43644579-35dc-4418-86e1-4f40c7bdcb8c/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 
13:45:58.004427 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5ed6b8d-cd37-4f5a-b673-90c3527d99dd" path="/var/lib/kubelet/pods/a5ed6b8d-cd37-4f5a-b673-90c3527d99dd/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.005571 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0bf6515-dde5-4125-b030-f56edc8f6e31" path="/var/lib/kubelet/pods/b0bf6515-dde5-4125-b030-f56edc8f6e31/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.007977 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd563802-76ca-4f00-bb21-caef86a804ce" path="/var/lib/kubelet/pods/cd563802-76ca-4f00-bb21-caef86a804ce/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.008712 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2031892-1082-4d65-8768-ac76c82bfff0" path="/var/lib/kubelet/pods/e2031892-1082-4d65-8768-ac76c82bfff0/volumes" Mar 12 13:45:58 crc kubenswrapper[4921]: I0312 13:45:58.009516 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea960b85-3b3c-4afb-a363-a3a0e3327701" path="/var/lib/kubelet/pods/ea960b85-3b3c-4afb-a363-a3a0e3327701/volumes" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.145278 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555386-fph74"] Mar 12 13:46:00 crc kubenswrapper[4921]: E0312 13:46:00.146520 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="816dce54-4563-4471-815c-2c8ecbb5bad1" containerName="collect-profiles" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.146556 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="816dce54-4563-4471-815c-2c8ecbb5bad1" containerName="collect-profiles" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.147175 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="816dce54-4563-4471-815c-2c8ecbb5bad1" containerName="collect-profiles" Mar 12 13:46:00 crc 
kubenswrapper[4921]: I0312 13:46:00.148426 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.151068 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.151242 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.152658 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.166837 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555386-fph74"] Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.249134 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glcb4\" (UniqueName: \"kubernetes.io/projected/4d97b3bf-7844-4079-85f8-2e38c3f16346-kube-api-access-glcb4\") pod \"auto-csr-approver-29555386-fph74\" (UID: \"4d97b3bf-7844-4079-85f8-2e38c3f16346\") " pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.351201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glcb4\" (UniqueName: \"kubernetes.io/projected/4d97b3bf-7844-4079-85f8-2e38c3f16346-kube-api-access-glcb4\") pod \"auto-csr-approver-29555386-fph74\" (UID: \"4d97b3bf-7844-4079-85f8-2e38c3f16346\") " pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.372977 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glcb4\" (UniqueName: \"kubernetes.io/projected/4d97b3bf-7844-4079-85f8-2e38c3f16346-kube-api-access-glcb4\") pod 
\"auto-csr-approver-29555386-fph74\" (UID: \"4d97b3bf-7844-4079-85f8-2e38c3f16346\") " pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.479325 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:00 crc kubenswrapper[4921]: I0312 13:46:00.749901 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555386-fph74"] Mar 12 13:46:01 crc kubenswrapper[4921]: I0312 13:46:01.244789 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555386-fph74" event={"ID":"4d97b3bf-7844-4079-85f8-2e38c3f16346","Type":"ContainerStarted","Data":"6a789c5aea13b45fde184798c1f70607670b476a510fb67f1c5ea63fc0f38166"} Mar 12 13:46:02 crc kubenswrapper[4921]: I0312 13:46:02.254092 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555386-fph74" event={"ID":"4d97b3bf-7844-4079-85f8-2e38c3f16346","Type":"ContainerStarted","Data":"2cfaf000d1932ead71adf47335ba9033cdb8f2f9271112224ca5f6c4ecc4b2b0"} Mar 12 13:46:02 crc kubenswrapper[4921]: I0312 13:46:02.273697 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555386-fph74" podStartSLOduration=1.26320333 podStartE2EDuration="2.273670762s" podCreationTimestamp="2026-03-12 13:46:00 +0000 UTC" firstStartedPulling="2026-03-12 13:46:00.771132308 +0000 UTC m=+2183.461204279" lastFinishedPulling="2026-03-12 13:46:01.78159974 +0000 UTC m=+2184.471671711" observedRunningTime="2026-03-12 13:46:02.265737218 +0000 UTC m=+2184.955809199" watchObservedRunningTime="2026-03-12 13:46:02.273670762 +0000 UTC m=+2184.963742733" Mar 12 13:46:03 crc kubenswrapper[4921]: I0312 13:46:03.262495 4921 generic.go:334] "Generic (PLEG): container finished" podID="4d97b3bf-7844-4079-85f8-2e38c3f16346" 
containerID="2cfaf000d1932ead71adf47335ba9033cdb8f2f9271112224ca5f6c4ecc4b2b0" exitCode=0 Mar 12 13:46:03 crc kubenswrapper[4921]: I0312 13:46:03.262560 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555386-fph74" event={"ID":"4d97b3bf-7844-4079-85f8-2e38c3f16346","Type":"ContainerDied","Data":"2cfaf000d1932ead71adf47335ba9033cdb8f2f9271112224ca5f6c4ecc4b2b0"} Mar 12 13:46:04 crc kubenswrapper[4921]: I0312 13:46:04.590228 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:04 crc kubenswrapper[4921]: I0312 13:46:04.750456 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glcb4\" (UniqueName: \"kubernetes.io/projected/4d97b3bf-7844-4079-85f8-2e38c3f16346-kube-api-access-glcb4\") pod \"4d97b3bf-7844-4079-85f8-2e38c3f16346\" (UID: \"4d97b3bf-7844-4079-85f8-2e38c3f16346\") " Mar 12 13:46:04 crc kubenswrapper[4921]: I0312 13:46:04.756245 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d97b3bf-7844-4079-85f8-2e38c3f16346-kube-api-access-glcb4" (OuterVolumeSpecName: "kube-api-access-glcb4") pod "4d97b3bf-7844-4079-85f8-2e38c3f16346" (UID: "4d97b3bf-7844-4079-85f8-2e38c3f16346"). InnerVolumeSpecName "kube-api-access-glcb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:46:04 crc kubenswrapper[4921]: I0312 13:46:04.852133 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glcb4\" (UniqueName: \"kubernetes.io/projected/4d97b3bf-7844-4079-85f8-2e38c3f16346-kube-api-access-glcb4\") on node \"crc\" DevicePath \"\"" Mar 12 13:46:05 crc kubenswrapper[4921]: I0312 13:46:05.280162 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555386-fph74" event={"ID":"4d97b3bf-7844-4079-85f8-2e38c3f16346","Type":"ContainerDied","Data":"6a789c5aea13b45fde184798c1f70607670b476a510fb67f1c5ea63fc0f38166"} Mar 12 13:46:05 crc kubenswrapper[4921]: I0312 13:46:05.280200 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a789c5aea13b45fde184798c1f70607670b476a510fb67f1c5ea63fc0f38166" Mar 12 13:46:05 crc kubenswrapper[4921]: I0312 13:46:05.280238 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555386-fph74" Mar 12 13:46:05 crc kubenswrapper[4921]: I0312 13:46:05.317864 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555380-s2f9t"] Mar 12 13:46:05 crc kubenswrapper[4921]: I0312 13:46:05.326621 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555380-s2f9t"] Mar 12 13:46:05 crc kubenswrapper[4921]: I0312 13:46:05.995745 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1" path="/var/lib/kubelet/pods/ab54291f-40a8-4b0b-9f61-5e4d81fd0bd1/volumes" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.854114 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5"] Mar 12 13:46:09 crc kubenswrapper[4921]: E0312 13:46:09.854925 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4d97b3bf-7844-4079-85f8-2e38c3f16346" containerName="oc" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.854945 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d97b3bf-7844-4079-85f8-2e38c3f16346" containerName="oc" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.855161 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d97b3bf-7844-4079-85f8-2e38c3f16346" containerName="oc" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.855868 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.858507 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.859242 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.861573 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.863120 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.870703 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.885137 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5"] Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.942141 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ceph\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.942195 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.942218 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.942424 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:09 crc kubenswrapper[4921]: I0312 13:46:09.942539 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvmgb\" (UniqueName: \"kubernetes.io/projected/66cfa5a2-1910-4504-84cb-24e75749c210-kube-api-access-xvmgb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.043772 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.044314 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvmgb\" (UniqueName: \"kubernetes.io/projected/66cfa5a2-1910-4504-84cb-24e75749c210-kube-api-access-xvmgb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.044369 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.044419 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.044447 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.052498 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.052544 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.054085 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.054525 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.068457 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvmgb\" (UniqueName: \"kubernetes.io/projected/66cfa5a2-1910-4504-84cb-24e75749c210-kube-api-access-xvmgb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.180389 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:10 crc kubenswrapper[4921]: I0312 13:46:10.711945 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5"] Mar 12 13:46:11 crc kubenswrapper[4921]: I0312 13:46:11.330832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" event={"ID":"66cfa5a2-1910-4504-84cb-24e75749c210","Type":"ContainerStarted","Data":"aeb3a754229c44239cdd3f0deac13bf5a533663aec638485115d90432a2e8b09"} Mar 12 13:46:12 crc kubenswrapper[4921]: I0312 13:46:12.343186 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" event={"ID":"66cfa5a2-1910-4504-84cb-24e75749c210","Type":"ContainerStarted","Data":"d5781f124070fb9add485a297298a535821c8fea27d4cf0cad3f2407ca7e84bd"} Mar 12 13:46:12 crc kubenswrapper[4921]: I0312 13:46:12.366634 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" podStartSLOduration=2.929304524 podStartE2EDuration="3.366615284s" podCreationTimestamp="2026-03-12 13:46:09 +0000 UTC" firstStartedPulling="2026-03-12 13:46:10.720876752 +0000 UTC m=+2193.410948723" 
lastFinishedPulling="2026-03-12 13:46:11.158187482 +0000 UTC m=+2193.848259483" observedRunningTime="2026-03-12 13:46:12.35900673 +0000 UTC m=+2195.049078711" watchObservedRunningTime="2026-03-12 13:46:12.366615284 +0000 UTC m=+2195.056687255" Mar 12 13:46:23 crc kubenswrapper[4921]: I0312 13:46:23.479075 4921 generic.go:334] "Generic (PLEG): container finished" podID="66cfa5a2-1910-4504-84cb-24e75749c210" containerID="d5781f124070fb9add485a297298a535821c8fea27d4cf0cad3f2407ca7e84bd" exitCode=0 Mar 12 13:46:23 crc kubenswrapper[4921]: I0312 13:46:23.479197 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" event={"ID":"66cfa5a2-1910-4504-84cb-24e75749c210","Type":"ContainerDied","Data":"d5781f124070fb9add485a297298a535821c8fea27d4cf0cad3f2407ca7e84bd"} Mar 12 13:46:24 crc kubenswrapper[4921]: I0312 13:46:24.880951 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.038118 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-inventory\") pod \"66cfa5a2-1910-4504-84cb-24e75749c210\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.038234 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-repo-setup-combined-ca-bundle\") pod \"66cfa5a2-1910-4504-84cb-24e75749c210\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.038273 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ceph\") pod \"66cfa5a2-1910-4504-84cb-24e75749c210\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.038329 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvmgb\" (UniqueName: \"kubernetes.io/projected/66cfa5a2-1910-4504-84cb-24e75749c210-kube-api-access-xvmgb\") pod \"66cfa5a2-1910-4504-84cb-24e75749c210\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.038378 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ssh-key-openstack-edpm-ipam\") pod \"66cfa5a2-1910-4504-84cb-24e75749c210\" (UID: \"66cfa5a2-1910-4504-84cb-24e75749c210\") " Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.044972 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "66cfa5a2-1910-4504-84cb-24e75749c210" (UID: "66cfa5a2-1910-4504-84cb-24e75749c210"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.045006 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66cfa5a2-1910-4504-84cb-24e75749c210-kube-api-access-xvmgb" (OuterVolumeSpecName: "kube-api-access-xvmgb") pod "66cfa5a2-1910-4504-84cb-24e75749c210" (UID: "66cfa5a2-1910-4504-84cb-24e75749c210"). InnerVolumeSpecName "kube-api-access-xvmgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.047277 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ceph" (OuterVolumeSpecName: "ceph") pod "66cfa5a2-1910-4504-84cb-24e75749c210" (UID: "66cfa5a2-1910-4504-84cb-24e75749c210"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.072908 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "66cfa5a2-1910-4504-84cb-24e75749c210" (UID: "66cfa5a2-1910-4504-84cb-24e75749c210"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.080743 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-inventory" (OuterVolumeSpecName: "inventory") pod "66cfa5a2-1910-4504-84cb-24e75749c210" (UID: "66cfa5a2-1910-4504-84cb-24e75749c210"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.140622 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvmgb\" (UniqueName: \"kubernetes.io/projected/66cfa5a2-1910-4504-84cb-24e75749c210-kube-api-access-xvmgb\") on node \"crc\" DevicePath \"\"" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.140648 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.140656 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.140665 4921 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.140674 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/66cfa5a2-1910-4504-84cb-24e75749c210-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.499000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" event={"ID":"66cfa5a2-1910-4504-84cb-24e75749c210","Type":"ContainerDied","Data":"aeb3a754229c44239cdd3f0deac13bf5a533663aec638485115d90432a2e8b09"} Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.499059 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeb3a754229c44239cdd3f0deac13bf5a533663aec638485115d90432a2e8b09" Mar 12 13:46:25 crc 
kubenswrapper[4921]: I0312 13:46:25.499072 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.618407 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf"] Mar 12 13:46:25 crc kubenswrapper[4921]: E0312 13:46:25.618924 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66cfa5a2-1910-4504-84cb-24e75749c210" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.618947 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="66cfa5a2-1910-4504-84cb-24e75749c210" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.619170 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="66cfa5a2-1910-4504-84cb-24e75749c210" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.620039 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.624262 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.624280 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.624339 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.624502 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.632537 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.632674 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf"] Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.765217 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.765486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vx5f\" (UniqueName: \"kubernetes.io/projected/e5130d9e-9678-42d8-9394-bcced05db054-kube-api-access-6vx5f\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.765541 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.765589 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.765673 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.867210 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.867563 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6vx5f\" (UniqueName: \"kubernetes.io/projected/e5130d9e-9678-42d8-9394-bcced05db054-kube-api-access-6vx5f\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.867588 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.867608 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.867642 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.871935 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.871998 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.872616 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.873138 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.891114 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vx5f\" (UniqueName: \"kubernetes.io/projected/e5130d9e-9678-42d8-9394-bcced05db054-kube-api-access-6vx5f\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:25 crc kubenswrapper[4921]: I0312 13:46:25.950384 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:46:26 crc kubenswrapper[4921]: I0312 13:46:26.546520 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf"] Mar 12 13:46:26 crc kubenswrapper[4921]: W0312 13:46:26.557832 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5130d9e_9678_42d8_9394_bcced05db054.slice/crio-13d8b3c2384d38346a9c0ca7b4e0ba8b7e0d6d8686b9dd7c77d0927729f12798 WatchSource:0}: Error finding container 13d8b3c2384d38346a9c0ca7b4e0ba8b7e0d6d8686b9dd7c77d0927729f12798: Status 404 returned error can't find the container with id 13d8b3c2384d38346a9c0ca7b4e0ba8b7e0d6d8686b9dd7c77d0927729f12798 Mar 12 13:46:27 crc kubenswrapper[4921]: I0312 13:46:27.537768 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" event={"ID":"e5130d9e-9678-42d8-9394-bcced05db054","Type":"ContainerStarted","Data":"29adaa419c2821b3d7dea3bd6fb2a14b3591751a1f7ad74ea82dad5b096a3843"} Mar 12 13:46:27 crc kubenswrapper[4921]: I0312 13:46:27.538244 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" event={"ID":"e5130d9e-9678-42d8-9394-bcced05db054","Type":"ContainerStarted","Data":"13d8b3c2384d38346a9c0ca7b4e0ba8b7e0d6d8686b9dd7c77d0927729f12798"} Mar 12 13:46:27 crc kubenswrapper[4921]: I0312 13:46:27.558756 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" podStartSLOduration=2.137328996 podStartE2EDuration="2.558739153s" podCreationTimestamp="2026-03-12 13:46:25 +0000 UTC" firstStartedPulling="2026-03-12 13:46:26.560148664 +0000 UTC m=+2209.250220645" lastFinishedPulling="2026-03-12 13:46:26.981558821 +0000 UTC m=+2209.671630802" 
observedRunningTime="2026-03-12 13:46:27.554797443 +0000 UTC m=+2210.244869414" watchObservedRunningTime="2026-03-12 13:46:27.558739153 +0000 UTC m=+2210.248811124" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.213908 4921 scope.go:117] "RemoveContainer" containerID="340af1523869e468840aa430e759e3a1bbeafd3ae7d343dcc5266e10acdb0a9c" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.326857 4921 scope.go:117] "RemoveContainer" containerID="1acf22ddeb5d88f6a619c688958b4a143b097e3e0314fa0c5f5765363c6f35b3" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.359378 4921 scope.go:117] "RemoveContainer" containerID="87b1556fc0b76ff6dc9a3b973a1949039dbe94fada8cca491cee4ba53e803be2" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.388282 4921 scope.go:117] "RemoveContainer" containerID="df989e8f8c9418baec821bcd9c40a977ea784f8b6e134fe2d3564147b7ed3e8e" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.434155 4921 scope.go:117] "RemoveContainer" containerID="3d99b376685fc632acec5ed4815f10ab19f97d0d9dbef0c549226197f3cf3612" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.494138 4921 scope.go:117] "RemoveContainer" containerID="8a2dc04e56154073b796882a618b9dce3727c28d2a8a4f8c61b892117102b146" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.538047 4921 scope.go:117] "RemoveContainer" containerID="52a7c12169126ebf1da8074f295d3a7a00929d9b8f6c6f3d8cb83850b8805005" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.598143 4921 scope.go:117] "RemoveContainer" containerID="3a0f22f57c77b380da1205f336f508b487366b0af040624e7ca73b5b43cb3ac5" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.625690 4921 scope.go:117] "RemoveContainer" containerID="4bbf860c94b9ecdf0cb3c7221f98a8ca71810aa582ecad1f6a39308e27449edd" Mar 12 13:46:43 crc kubenswrapper[4921]: I0312 13:46:43.690794 4921 scope.go:117] "RemoveContainer" containerID="f9ccbfee9fa36be612914ab85b2455ec477225b301d234b847e9a3a585366ed8" Mar 12 13:46:43 crc kubenswrapper[4921]: 
I0312 13:46:43.724077 4921 scope.go:117] "RemoveContainer" containerID="bb49d6322eed3ab617b5198826c0d92e19218aace45460b8e6fc78b761b0f700" Mar 12 13:46:56 crc kubenswrapper[4921]: I0312 13:46:56.323445 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:46:56 crc kubenswrapper[4921]: I0312 13:46:56.323911 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:47:21 crc kubenswrapper[4921]: I0312 13:47:21.004637 4921 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-kf975 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.30:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 13:47:21 crc kubenswrapper[4921]: I0312 13:47:21.005305 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kf975" podUID="20f1f547-f958-419e-a5c2-58695625d6ad" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.30:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 13:47:26 crc kubenswrapper[4921]: I0312 13:47:26.325866 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:47:26 crc kubenswrapper[4921]: I0312 13:47:26.326631 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:47:56 crc kubenswrapper[4921]: I0312 13:47:56.324507 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:47:56 crc kubenswrapper[4921]: I0312 13:47:56.325116 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:47:56 crc kubenswrapper[4921]: I0312 13:47:56.325162 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:47:56 crc kubenswrapper[4921]: I0312 13:47:56.325768 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:47:56 crc kubenswrapper[4921]: I0312 13:47:56.325851 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" gracePeriod=600 Mar 12 13:47:56 crc kubenswrapper[4921]: E0312 13:47:56.472873 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:47:57 crc kubenswrapper[4921]: I0312 13:47:57.385331 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" exitCode=0 Mar 12 13:47:57 crc kubenswrapper[4921]: I0312 13:47:57.385399 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"} Mar 12 13:47:57 crc kubenswrapper[4921]: I0312 13:47:57.385712 4921 scope.go:117] "RemoveContainer" containerID="a4a7fc64c7f961b98d4fd823b1232409bdff82a1f121c2f39f52f57afd9c59e2" Mar 12 13:47:57 crc kubenswrapper[4921]: I0312 13:47:57.386492 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:47:57 crc kubenswrapper[4921]: E0312 13:47:57.386788 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.145908 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555388-nhldt"] Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.147362 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.150110 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.150614 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.151557 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.155361 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555388-nhldt"] Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.283036 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b68nm\" (UniqueName: \"kubernetes.io/projected/236910c2-bd47-434a-af0c-f599664fda24-kube-api-access-b68nm\") pod \"auto-csr-approver-29555388-nhldt\" (UID: \"236910c2-bd47-434a-af0c-f599664fda24\") " pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.388361 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b68nm\" (UniqueName: \"kubernetes.io/projected/236910c2-bd47-434a-af0c-f599664fda24-kube-api-access-b68nm\") 
pod \"auto-csr-approver-29555388-nhldt\" (UID: \"236910c2-bd47-434a-af0c-f599664fda24\") " pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.414691 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b68nm\" (UniqueName: \"kubernetes.io/projected/236910c2-bd47-434a-af0c-f599664fda24-kube-api-access-b68nm\") pod \"auto-csr-approver-29555388-nhldt\" (UID: \"236910c2-bd47-434a-af0c-f599664fda24\") " pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.466436 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:00 crc kubenswrapper[4921]: I0312 13:48:00.901111 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555388-nhldt"] Mar 12 13:48:01 crc kubenswrapper[4921]: I0312 13:48:01.425013 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555388-nhldt" event={"ID":"236910c2-bd47-434a-af0c-f599664fda24","Type":"ContainerStarted","Data":"eb32c5b00bca833faf2d363fe9e7ee51ebebff6d209d59e465ad3829cc6412c5"} Mar 12 13:48:03 crc kubenswrapper[4921]: I0312 13:48:03.443754 4921 generic.go:334] "Generic (PLEG): container finished" podID="236910c2-bd47-434a-af0c-f599664fda24" containerID="9c27737841a993b5bd568e4afa70a46914aead1836c096c9e3c7edbacedc46d6" exitCode=0 Mar 12 13:48:03 crc kubenswrapper[4921]: I0312 13:48:03.443832 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555388-nhldt" event={"ID":"236910c2-bd47-434a-af0c-f599664fda24","Type":"ContainerDied","Data":"9c27737841a993b5bd568e4afa70a46914aead1836c096c9e3c7edbacedc46d6"} Mar 12 13:48:04 crc kubenswrapper[4921]: I0312 13:48:04.805963 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:04 crc kubenswrapper[4921]: I0312 13:48:04.875034 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b68nm\" (UniqueName: \"kubernetes.io/projected/236910c2-bd47-434a-af0c-f599664fda24-kube-api-access-b68nm\") pod \"236910c2-bd47-434a-af0c-f599664fda24\" (UID: \"236910c2-bd47-434a-af0c-f599664fda24\") " Mar 12 13:48:04 crc kubenswrapper[4921]: I0312 13:48:04.886646 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236910c2-bd47-434a-af0c-f599664fda24-kube-api-access-b68nm" (OuterVolumeSpecName: "kube-api-access-b68nm") pod "236910c2-bd47-434a-af0c-f599664fda24" (UID: "236910c2-bd47-434a-af0c-f599664fda24"). InnerVolumeSpecName "kube-api-access-b68nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:48:04 crc kubenswrapper[4921]: I0312 13:48:04.977238 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b68nm\" (UniqueName: \"kubernetes.io/projected/236910c2-bd47-434a-af0c-f599664fda24-kube-api-access-b68nm\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:05 crc kubenswrapper[4921]: I0312 13:48:05.465717 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555388-nhldt" event={"ID":"236910c2-bd47-434a-af0c-f599664fda24","Type":"ContainerDied","Data":"eb32c5b00bca833faf2d363fe9e7ee51ebebff6d209d59e465ad3829cc6412c5"} Mar 12 13:48:05 crc kubenswrapper[4921]: I0312 13:48:05.465758 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb32c5b00bca833faf2d363fe9e7ee51ebebff6d209d59e465ad3829cc6412c5" Mar 12 13:48:05 crc kubenswrapper[4921]: I0312 13:48:05.465780 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555388-nhldt" Mar 12 13:48:05 crc kubenswrapper[4921]: I0312 13:48:05.885819 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555382-9p2xd"] Mar 12 13:48:05 crc kubenswrapper[4921]: I0312 13:48:05.893691 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555382-9p2xd"] Mar 12 13:48:05 crc kubenswrapper[4921]: I0312 13:48:05.993968 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99" path="/var/lib/kubelet/pods/64b7e9a3-5b00-4d30-94c6-8bfa7c37ea99/volumes" Mar 12 13:48:09 crc kubenswrapper[4921]: I0312 13:48:09.514607 4921 generic.go:334] "Generic (PLEG): container finished" podID="e5130d9e-9678-42d8-9394-bcced05db054" containerID="29adaa419c2821b3d7dea3bd6fb2a14b3591751a1f7ad74ea82dad5b096a3843" exitCode=0 Mar 12 13:48:09 crc kubenswrapper[4921]: I0312 13:48:09.514724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" event={"ID":"e5130d9e-9678-42d8-9394-bcced05db054","Type":"ContainerDied","Data":"29adaa419c2821b3d7dea3bd6fb2a14b3591751a1f7ad74ea82dad5b096a3843"} Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.916786 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.987640 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-inventory\") pod \"e5130d9e-9678-42d8-9394-bcced05db054\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.988054 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ceph\") pod \"e5130d9e-9678-42d8-9394-bcced05db054\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.988185 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vx5f\" (UniqueName: \"kubernetes.io/projected/e5130d9e-9678-42d8-9394-bcced05db054-kube-api-access-6vx5f\") pod \"e5130d9e-9678-42d8-9394-bcced05db054\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.988308 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-bootstrap-combined-ca-bundle\") pod \"e5130d9e-9678-42d8-9394-bcced05db054\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.988470 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ssh-key-openstack-edpm-ipam\") pod \"e5130d9e-9678-42d8-9394-bcced05db054\" (UID: \"e5130d9e-9678-42d8-9394-bcced05db054\") " Mar 12 13:48:10 crc kubenswrapper[4921]: I0312 13:48:10.998588 4921 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e5130d9e-9678-42d8-9394-bcced05db054" (UID: "e5130d9e-9678-42d8-9394-bcced05db054"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.005041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5130d9e-9678-42d8-9394-bcced05db054-kube-api-access-6vx5f" (OuterVolumeSpecName: "kube-api-access-6vx5f") pod "e5130d9e-9678-42d8-9394-bcced05db054" (UID: "e5130d9e-9678-42d8-9394-bcced05db054"). InnerVolumeSpecName "kube-api-access-6vx5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.006484 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ceph" (OuterVolumeSpecName: "ceph") pod "e5130d9e-9678-42d8-9394-bcced05db054" (UID: "e5130d9e-9678-42d8-9394-bcced05db054"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.036578 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e5130d9e-9678-42d8-9394-bcced05db054" (UID: "e5130d9e-9678-42d8-9394-bcced05db054"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.040072 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-inventory" (OuterVolumeSpecName: "inventory") pod "e5130d9e-9678-42d8-9394-bcced05db054" (UID: "e5130d9e-9678-42d8-9394-bcced05db054"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.091426 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.091475 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.091494 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.091507 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vx5f\" (UniqueName: \"kubernetes.io/projected/e5130d9e-9678-42d8-9394-bcced05db054-kube-api-access-6vx5f\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.091518 4921 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5130d9e-9678-42d8-9394-bcced05db054-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.554543 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" event={"ID":"e5130d9e-9678-42d8-9394-bcced05db054","Type":"ContainerDied","Data":"13d8b3c2384d38346a9c0ca7b4e0ba8b7e0d6d8686b9dd7c77d0927729f12798"} Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.555019 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.555061 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d8b3c2384d38346a9c0ca7b4e0ba8b7e0d6d8686b9dd7c77d0927729f12798" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.632210 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw"] Mar 12 13:48:11 crc kubenswrapper[4921]: E0312 13:48:11.632554 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236910c2-bd47-434a-af0c-f599664fda24" containerName="oc" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.632572 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="236910c2-bd47-434a-af0c-f599664fda24" containerName="oc" Mar 12 13:48:11 crc kubenswrapper[4921]: E0312 13:48:11.632597 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5130d9e-9678-42d8-9394-bcced05db054" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.632605 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5130d9e-9678-42d8-9394-bcced05db054" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.632772 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="236910c2-bd47-434a-af0c-f599664fda24" containerName="oc" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.632785 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e5130d9e-9678-42d8-9394-bcced05db054" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.633343 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.637235 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.637257 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.637429 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.637712 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.638111 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.651135 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw"] Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.700974 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.701030 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.701079 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.701184 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvhx\" (UniqueName: \"kubernetes.io/projected/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-kube-api-access-cbvhx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.803115 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.803313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvhx\" (UniqueName: \"kubernetes.io/projected/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-kube-api-access-cbvhx\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.803439 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.803493 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.808462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.813237 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 
13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.816371 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.836539 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvhx\" (UniqueName: \"kubernetes.io/projected/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-kube-api-access-cbvhx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.958442 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:11 crc kubenswrapper[4921]: I0312 13:48:11.983727 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:48:11 crc kubenswrapper[4921]: E0312 13:48:11.984515 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:48:12 crc kubenswrapper[4921]: I0312 13:48:12.553803 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw"] Mar 12 13:48:13 crc 
kubenswrapper[4921]: I0312 13:48:13.574117 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" event={"ID":"0a18ea59-b5e6-40e3-8096-0f2bda4563bb","Type":"ContainerStarted","Data":"89a780d3186d95de2e0f4d12702d1b2088401c890ac0bf0a79fe84f0ce7840fa"} Mar 12 13:48:14 crc kubenswrapper[4921]: I0312 13:48:14.585074 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" event={"ID":"0a18ea59-b5e6-40e3-8096-0f2bda4563bb","Type":"ContainerStarted","Data":"83d5715cbeeac7de7ca3922e2a15c1ddcc3a000d898b20b1677307c745a01077"} Mar 12 13:48:14 crc kubenswrapper[4921]: I0312 13:48:14.606557 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" podStartSLOduration=2.754887464 podStartE2EDuration="3.606541117s" podCreationTimestamp="2026-03-12 13:48:11 +0000 UTC" firstStartedPulling="2026-03-12 13:48:12.570996807 +0000 UTC m=+2315.261068778" lastFinishedPulling="2026-03-12 13:48:13.42265045 +0000 UTC m=+2316.112722431" observedRunningTime="2026-03-12 13:48:14.601741501 +0000 UTC m=+2317.291813512" watchObservedRunningTime="2026-03-12 13:48:14.606541117 +0000 UTC m=+2317.296613088" Mar 12 13:48:23 crc kubenswrapper[4921]: I0312 13:48:23.983450 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:48:23 crc kubenswrapper[4921]: E0312 13:48:23.984551 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" 
Mar 12 13:48:36 crc kubenswrapper[4921]: I0312 13:48:36.984054 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:48:36 crc kubenswrapper[4921]: E0312 13:48:36.985145 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:48:41 crc kubenswrapper[4921]: I0312 13:48:41.814670 4921 generic.go:334] "Generic (PLEG): container finished" podID="0a18ea59-b5e6-40e3-8096-0f2bda4563bb" containerID="83d5715cbeeac7de7ca3922e2a15c1ddcc3a000d898b20b1677307c745a01077" exitCode=0 Mar 12 13:48:41 crc kubenswrapper[4921]: I0312 13:48:41.814776 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" event={"ID":"0a18ea59-b5e6-40e3-8096-0f2bda4563bb","Type":"ContainerDied","Data":"83d5715cbeeac7de7ca3922e2a15c1ddcc3a000d898b20b1677307c745a01077"} Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.328101 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.517577 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-inventory\") pod \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.517772 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbvhx\" (UniqueName: \"kubernetes.io/projected/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-kube-api-access-cbvhx\") pod \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.517868 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ceph\") pod \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.517942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ssh-key-openstack-edpm-ipam\") pod \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\" (UID: \"0a18ea59-b5e6-40e3-8096-0f2bda4563bb\") " Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.524834 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-kube-api-access-cbvhx" (OuterVolumeSpecName: "kube-api-access-cbvhx") pod "0a18ea59-b5e6-40e3-8096-0f2bda4563bb" (UID: "0a18ea59-b5e6-40e3-8096-0f2bda4563bb"). InnerVolumeSpecName "kube-api-access-cbvhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.525760 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ceph" (OuterVolumeSpecName: "ceph") pod "0a18ea59-b5e6-40e3-8096-0f2bda4563bb" (UID: "0a18ea59-b5e6-40e3-8096-0f2bda4563bb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.554208 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-inventory" (OuterVolumeSpecName: "inventory") pod "0a18ea59-b5e6-40e3-8096-0f2bda4563bb" (UID: "0a18ea59-b5e6-40e3-8096-0f2bda4563bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.564653 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0a18ea59-b5e6-40e3-8096-0f2bda4563bb" (UID: "0a18ea59-b5e6-40e3-8096-0f2bda4563bb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.619871 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.619902 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbvhx\" (UniqueName: \"kubernetes.io/projected/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-kube-api-access-cbvhx\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.619914 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.619924 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0a18ea59-b5e6-40e3-8096-0f2bda4563bb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.834893 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" event={"ID":"0a18ea59-b5e6-40e3-8096-0f2bda4563bb","Type":"ContainerDied","Data":"89a780d3186d95de2e0f4d12702d1b2088401c890ac0bf0a79fe84f0ce7840fa"} Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.834934 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89a780d3186d95de2e0f4d12702d1b2088401c890ac0bf0a79fe84f0ce7840fa" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.835473 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.953387 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm"] Mar 12 13:48:43 crc kubenswrapper[4921]: E0312 13:48:43.954282 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a18ea59-b5e6-40e3-8096-0f2bda4563bb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.954309 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a18ea59-b5e6-40e3-8096-0f2bda4563bb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.954598 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a18ea59-b5e6-40e3-8096-0f2bda4563bb" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.955540 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.962350 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.962468 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.962489 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.962597 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.962747 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.965978 4921 scope.go:117] "RemoveContainer" containerID="86fd9a55d095552a12faf6f254fc0f37bf76c29cd783a1fb9032e2b2cca9982b" Mar 12 13:48:43 crc kubenswrapper[4921]: I0312 13:48:43.982799 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm"] Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.128975 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.129042 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.129065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wxct\" (UniqueName: \"kubernetes.io/projected/36211ec3-db4f-4485-a93d-08dd120af919-kube-api-access-2wxct\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.129246 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.231128 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.231242 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.231286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wxct\" (UniqueName: \"kubernetes.io/projected/36211ec3-db4f-4485-a93d-08dd120af919-kube-api-access-2wxct\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.231364 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.235752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.235937 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: 
I0312 13:48:44.237362 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.248109 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wxct\" (UniqueName: \"kubernetes.io/projected/36211ec3-db4f-4485-a93d-08dd120af919-kube-api-access-2wxct\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.278175 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.818400 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm"] Mar 12 13:48:44 crc kubenswrapper[4921]: W0312 13:48:44.826108 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36211ec3_db4f_4485_a93d_08dd120af919.slice/crio-28a16d9666b4bf9f9f637b9f75e25f5aa215826bc3366565cccca1837fdf8ef5 WatchSource:0}: Error finding container 28a16d9666b4bf9f9f637b9f75e25f5aa215826bc3366565cccca1837fdf8ef5: Status 404 returned error can't find the container with id 28a16d9666b4bf9f9f637b9f75e25f5aa215826bc3366565cccca1837fdf8ef5 Mar 12 13:48:44 crc kubenswrapper[4921]: I0312 13:48:44.846893 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" 
event={"ID":"36211ec3-db4f-4485-a93d-08dd120af919","Type":"ContainerStarted","Data":"28a16d9666b4bf9f9f637b9f75e25f5aa215826bc3366565cccca1837fdf8ef5"} Mar 12 13:48:45 crc kubenswrapper[4921]: I0312 13:48:45.856580 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" event={"ID":"36211ec3-db4f-4485-a93d-08dd120af919","Type":"ContainerStarted","Data":"b3f4c41a85d0e613c1ecaad122a980f5a471b0258bede4f795d343356e56cdcf"} Mar 12 13:48:45 crc kubenswrapper[4921]: I0312 13:48:45.875837 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" podStartSLOduration=2.165311036 podStartE2EDuration="2.875767178s" podCreationTimestamp="2026-03-12 13:48:43 +0000 UTC" firstStartedPulling="2026-03-12 13:48:44.828688528 +0000 UTC m=+2347.518760499" lastFinishedPulling="2026-03-12 13:48:45.53914464 +0000 UTC m=+2348.229216641" observedRunningTime="2026-03-12 13:48:45.869394903 +0000 UTC m=+2348.559466884" watchObservedRunningTime="2026-03-12 13:48:45.875767178 +0000 UTC m=+2348.565839159" Mar 12 13:48:51 crc kubenswrapper[4921]: I0312 13:48:51.908710 4921 generic.go:334] "Generic (PLEG): container finished" podID="36211ec3-db4f-4485-a93d-08dd120af919" containerID="b3f4c41a85d0e613c1ecaad122a980f5a471b0258bede4f795d343356e56cdcf" exitCode=0 Mar 12 13:48:51 crc kubenswrapper[4921]: I0312 13:48:51.908828 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" event={"ID":"36211ec3-db4f-4485-a93d-08dd120af919","Type":"ContainerDied","Data":"b3f4c41a85d0e613c1ecaad122a980f5a471b0258bede4f795d343356e56cdcf"} Mar 12 13:48:51 crc kubenswrapper[4921]: I0312 13:48:51.983942 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:48:51 crc kubenswrapper[4921]: E0312 13:48:51.984354 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.271484 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.311442 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-inventory\") pod \"36211ec3-db4f-4485-a93d-08dd120af919\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.311520 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ceph\") pod \"36211ec3-db4f-4485-a93d-08dd120af919\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.311590 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ssh-key-openstack-edpm-ipam\") pod \"36211ec3-db4f-4485-a93d-08dd120af919\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.311611 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wxct\" (UniqueName: \"kubernetes.io/projected/36211ec3-db4f-4485-a93d-08dd120af919-kube-api-access-2wxct\") pod 
\"36211ec3-db4f-4485-a93d-08dd120af919\" (UID: \"36211ec3-db4f-4485-a93d-08dd120af919\") " Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.321202 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ceph" (OuterVolumeSpecName: "ceph") pod "36211ec3-db4f-4485-a93d-08dd120af919" (UID: "36211ec3-db4f-4485-a93d-08dd120af919"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.322137 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36211ec3-db4f-4485-a93d-08dd120af919-kube-api-access-2wxct" (OuterVolumeSpecName: "kube-api-access-2wxct") pod "36211ec3-db4f-4485-a93d-08dd120af919" (UID: "36211ec3-db4f-4485-a93d-08dd120af919"). InnerVolumeSpecName "kube-api-access-2wxct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.347795 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36211ec3-db4f-4485-a93d-08dd120af919" (UID: "36211ec3-db4f-4485-a93d-08dd120af919"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.368527 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-inventory" (OuterVolumeSpecName: "inventory") pod "36211ec3-db4f-4485-a93d-08dd120af919" (UID: "36211ec3-db4f-4485-a93d-08dd120af919"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.414021 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.414069 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.414089 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36211ec3-db4f-4485-a93d-08dd120af919-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.414111 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wxct\" (UniqueName: \"kubernetes.io/projected/36211ec3-db4f-4485-a93d-08dd120af919-kube-api-access-2wxct\") on node \"crc\" DevicePath \"\"" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.932710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" event={"ID":"36211ec3-db4f-4485-a93d-08dd120af919","Type":"ContainerDied","Data":"28a16d9666b4bf9f9f637b9f75e25f5aa215826bc3366565cccca1837fdf8ef5"} Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.933161 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28a16d9666b4bf9f9f637b9f75e25f5aa215826bc3366565cccca1837fdf8ef5" Mar 12 13:48:53 crc kubenswrapper[4921]: I0312 13:48:53.933032 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.037311 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf"] Mar 12 13:48:54 crc kubenswrapper[4921]: E0312 13:48:54.037713 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36211ec3-db4f-4485-a93d-08dd120af919" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.037734 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="36211ec3-db4f-4485-a93d-08dd120af919" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.038003 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="36211ec3-db4f-4485-a93d-08dd120af919" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.038742 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.046717 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.051593 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.051636 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.051846 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.052113 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.055565 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf"] Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.231243 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.231328 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vz4\" (UniqueName: \"kubernetes.io/projected/56567424-34cd-49a4-ad03-c72a25a07058-kube-api-access-f9vz4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: 
\"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.231471 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.231710 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.333471 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.333558 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vz4\" (UniqueName: \"kubernetes.io/projected/56567424-34cd-49a4-ad03-c72a25a07058-kube-api-access-f9vz4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.333593 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.333682 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.337298 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.337315 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.340970 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.356086 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vz4\" (UniqueName: \"kubernetes.io/projected/56567424-34cd-49a4-ad03-c72a25a07058-kube-api-access-f9vz4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-hfsqf\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:54 crc kubenswrapper[4921]: I0312 13:48:54.360529 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:48:55 crc kubenswrapper[4921]: I0312 13:48:55.035165 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf"] Mar 12 13:48:55 crc kubenswrapper[4921]: I0312 13:48:55.949808 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" event={"ID":"56567424-34cd-49a4-ad03-c72a25a07058","Type":"ContainerStarted","Data":"02461580253c51a2893153c9d5d6bb476db59b6782ffc60fec479847bbff44ff"} Mar 12 13:48:55 crc kubenswrapper[4921]: I0312 13:48:55.950187 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" event={"ID":"56567424-34cd-49a4-ad03-c72a25a07058","Type":"ContainerStarted","Data":"1ad5b01d8dfd8be3ad1b438ff9e8a3c3c48770068832020335ce9df7ace58964"} Mar 12 13:48:55 crc kubenswrapper[4921]: I0312 13:48:55.983391 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" podStartSLOduration=1.598455103 podStartE2EDuration="1.983368926s" podCreationTimestamp="2026-03-12 13:48:54 +0000 UTC" firstStartedPulling="2026-03-12 13:48:55.045915773 +0000 UTC m=+2357.735987744" 
lastFinishedPulling="2026-03-12 13:48:55.430829596 +0000 UTC m=+2358.120901567" observedRunningTime="2026-03-12 13:48:55.975089203 +0000 UTC m=+2358.665161174" watchObservedRunningTime="2026-03-12 13:48:55.983368926 +0000 UTC m=+2358.673440917" Mar 12 13:49:02 crc kubenswrapper[4921]: I0312 13:49:02.982851 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:49:02 crc kubenswrapper[4921]: E0312 13:49:02.983665 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:49:13 crc kubenswrapper[4921]: I0312 13:49:13.982932 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:49:13 crc kubenswrapper[4921]: E0312 13:49:13.983611 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:49:28 crc kubenswrapper[4921]: I0312 13:49:28.983965 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:49:28 crc kubenswrapper[4921]: E0312 13:49:28.984844 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:49:34 crc kubenswrapper[4921]: I0312 13:49:34.294918 4921 generic.go:334] "Generic (PLEG): container finished" podID="56567424-34cd-49a4-ad03-c72a25a07058" containerID="02461580253c51a2893153c9d5d6bb476db59b6782ffc60fec479847bbff44ff" exitCode=0 Mar 12 13:49:34 crc kubenswrapper[4921]: I0312 13:49:34.295048 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" event={"ID":"56567424-34cd-49a4-ad03-c72a25a07058","Type":"ContainerDied","Data":"02461580253c51a2893153c9d5d6bb476db59b6782ffc60fec479847bbff44ff"} Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.747449 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.942021 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vz4\" (UniqueName: \"kubernetes.io/projected/56567424-34cd-49a4-ad03-c72a25a07058-kube-api-access-f9vz4\") pod \"56567424-34cd-49a4-ad03-c72a25a07058\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.942573 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ceph\") pod \"56567424-34cd-49a4-ad03-c72a25a07058\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.942950 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-inventory\") pod \"56567424-34cd-49a4-ad03-c72a25a07058\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.943180 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ssh-key-openstack-edpm-ipam\") pod \"56567424-34cd-49a4-ad03-c72a25a07058\" (UID: \"56567424-34cd-49a4-ad03-c72a25a07058\") " Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.948652 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ceph" (OuterVolumeSpecName: "ceph") pod "56567424-34cd-49a4-ad03-c72a25a07058" (UID: "56567424-34cd-49a4-ad03-c72a25a07058"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.950909 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56567424-34cd-49a4-ad03-c72a25a07058-kube-api-access-f9vz4" (OuterVolumeSpecName: "kube-api-access-f9vz4") pod "56567424-34cd-49a4-ad03-c72a25a07058" (UID: "56567424-34cd-49a4-ad03-c72a25a07058"). InnerVolumeSpecName "kube-api-access-f9vz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.976920 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-inventory" (OuterVolumeSpecName: "inventory") pod "56567424-34cd-49a4-ad03-c72a25a07058" (UID: "56567424-34cd-49a4-ad03-c72a25a07058"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:49:35 crc kubenswrapper[4921]: I0312 13:49:35.988264 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "56567424-34cd-49a4-ad03-c72a25a07058" (UID: "56567424-34cd-49a4-ad03-c72a25a07058"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.045613 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vz4\" (UniqueName: \"kubernetes.io/projected/56567424-34cd-49a4-ad03-c72a25a07058-kube-api-access-f9vz4\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.045646 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.045657 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.045666 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/56567424-34cd-49a4-ad03-c72a25a07058-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.319097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" event={"ID":"56567424-34cd-49a4-ad03-c72a25a07058","Type":"ContainerDied","Data":"1ad5b01d8dfd8be3ad1b438ff9e8a3c3c48770068832020335ce9df7ace58964"} Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.319140 
4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ad5b01d8dfd8be3ad1b438ff9e8a3c3c48770068832020335ce9df7ace58964" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.319268 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-hfsqf" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.430836 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk"] Mar 12 13:49:36 crc kubenswrapper[4921]: E0312 13:49:36.431223 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56567424-34cd-49a4-ad03-c72a25a07058" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.431240 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="56567424-34cd-49a4-ad03-c72a25a07058" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.431414 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="56567424-34cd-49a4-ad03-c72a25a07058" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.431991 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.433994 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.434368 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.434837 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.435132 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.435428 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.449427 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk"] Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.553767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.553933 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpcv\" (UniqueName: \"kubernetes.io/projected/cbaebc43-5127-4000-abb3-79a878177cd2-kube-api-access-krpcv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: 
\"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.554027 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.554094 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.655235 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.655359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.655430 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-krpcv\" (UniqueName: \"kubernetes.io/projected/cbaebc43-5127-4000-abb3-79a878177cd2-kube-api-access-krpcv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.655468 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.660098 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.660863 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.661139 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: 
\"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.674555 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpcv\" (UniqueName: \"kubernetes.io/projected/cbaebc43-5127-4000-abb3-79a878177cd2-kube-api-access-krpcv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:36 crc kubenswrapper[4921]: I0312 13:49:36.749784 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:37 crc kubenswrapper[4921]: I0312 13:49:37.240120 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk"] Mar 12 13:49:37 crc kubenswrapper[4921]: I0312 13:49:37.244631 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:49:37 crc kubenswrapper[4921]: I0312 13:49:37.329564 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" event={"ID":"cbaebc43-5127-4000-abb3-79a878177cd2","Type":"ContainerStarted","Data":"6b66033e3615adc743a48ac850c9cf8cb26e2c2c5cc5afeaa8f2ddbf5091106b"} Mar 12 13:49:38 crc kubenswrapper[4921]: I0312 13:49:38.341195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" event={"ID":"cbaebc43-5127-4000-abb3-79a878177cd2","Type":"ContainerStarted","Data":"a9e97147c90ab636bfdf4e42b026201960333e956da5e27dfe1469caae6b413b"} Mar 12 13:49:38 crc kubenswrapper[4921]: I0312 13:49:38.366633 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" podStartSLOduration=1.72028647 podStartE2EDuration="2.366606573s" podCreationTimestamp="2026-03-12 13:49:36 +0000 UTC" firstStartedPulling="2026-03-12 13:49:37.24438942 +0000 UTC m=+2399.934461391" lastFinishedPulling="2026-03-12 13:49:37.890709523 +0000 UTC m=+2400.580781494" observedRunningTime="2026-03-12 13:49:38.360278081 +0000 UTC m=+2401.050350072" watchObservedRunningTime="2026-03-12 13:49:38.366606573 +0000 UTC m=+2401.056678564" Mar 12 13:49:40 crc kubenswrapper[4921]: I0312 13:49:40.983434 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:49:40 crc kubenswrapper[4921]: E0312 13:49:40.984196 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:49:42 crc kubenswrapper[4921]: I0312 13:49:42.374062 4921 generic.go:334] "Generic (PLEG): container finished" podID="cbaebc43-5127-4000-abb3-79a878177cd2" containerID="a9e97147c90ab636bfdf4e42b026201960333e956da5e27dfe1469caae6b413b" exitCode=0 Mar 12 13:49:42 crc kubenswrapper[4921]: I0312 13:49:42.374433 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" event={"ID":"cbaebc43-5127-4000-abb3-79a878177cd2","Type":"ContainerDied","Data":"a9e97147c90ab636bfdf4e42b026201960333e956da5e27dfe1469caae6b413b"} Mar 12 13:49:43 crc kubenswrapper[4921]: I0312 13:49:43.789574 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:43 crc kubenswrapper[4921]: I0312 13:49:43.989512 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krpcv\" (UniqueName: \"kubernetes.io/projected/cbaebc43-5127-4000-abb3-79a878177cd2-kube-api-access-krpcv\") pod \"cbaebc43-5127-4000-abb3-79a878177cd2\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " Mar 12 13:49:43 crc kubenswrapper[4921]: I0312 13:49:43.989886 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ssh-key-openstack-edpm-ipam\") pod \"cbaebc43-5127-4000-abb3-79a878177cd2\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " Mar 12 13:49:43 crc kubenswrapper[4921]: I0312 13:49:43.989920 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-inventory\") pod \"cbaebc43-5127-4000-abb3-79a878177cd2\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " Mar 12 13:49:43 crc kubenswrapper[4921]: I0312 13:49:43.989968 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ceph\") pod \"cbaebc43-5127-4000-abb3-79a878177cd2\" (UID: \"cbaebc43-5127-4000-abb3-79a878177cd2\") " Mar 12 13:49:43 crc kubenswrapper[4921]: I0312 13:49:43.996634 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ceph" (OuterVolumeSpecName: "ceph") pod "cbaebc43-5127-4000-abb3-79a878177cd2" (UID: "cbaebc43-5127-4000-abb3-79a878177cd2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.009259 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbaebc43-5127-4000-abb3-79a878177cd2-kube-api-access-krpcv" (OuterVolumeSpecName: "kube-api-access-krpcv") pod "cbaebc43-5127-4000-abb3-79a878177cd2" (UID: "cbaebc43-5127-4000-abb3-79a878177cd2"). InnerVolumeSpecName "kube-api-access-krpcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.016034 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-inventory" (OuterVolumeSpecName: "inventory") pod "cbaebc43-5127-4000-abb3-79a878177cd2" (UID: "cbaebc43-5127-4000-abb3-79a878177cd2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.025625 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cbaebc43-5127-4000-abb3-79a878177cd2" (UID: "cbaebc43-5127-4000-abb3-79a878177cd2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.092599 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.092632 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.092641 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cbaebc43-5127-4000-abb3-79a878177cd2-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.092651 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krpcv\" (UniqueName: \"kubernetes.io/projected/cbaebc43-5127-4000-abb3-79a878177cd2-kube-api-access-krpcv\") on node \"crc\" DevicePath \"\"" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.398317 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" event={"ID":"cbaebc43-5127-4000-abb3-79a878177cd2","Type":"ContainerDied","Data":"6b66033e3615adc743a48ac850c9cf8cb26e2c2c5cc5afeaa8f2ddbf5091106b"} Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.398390 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b66033e3615adc743a48ac850c9cf8cb26e2c2c5cc5afeaa8f2ddbf5091106b" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.398431 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.468587 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck"] Mar 12 13:49:44 crc kubenswrapper[4921]: E0312 13:49:44.469036 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbaebc43-5127-4000-abb3-79a878177cd2" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.469058 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbaebc43-5127-4000-abb3-79a878177cd2" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.469237 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbaebc43-5127-4000-abb3-79a878177cd2" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.469989 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.473262 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.474616 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.474616 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.474695 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.474898 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.483560 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck"] Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.501094 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-kube-api-access-kp2fs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.501183 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: 
\"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.501244 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.501282 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.602703 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.602806 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.603075 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-kube-api-access-kp2fs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.603557 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.607178 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.607219 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.609059 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: 
\"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.620090 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-kube-api-access-kp2fs\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pcpck\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:44 crc kubenswrapper[4921]: I0312 13:49:44.786313 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:49:45 crc kubenswrapper[4921]: I0312 13:49:45.361855 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck"] Mar 12 13:49:45 crc kubenswrapper[4921]: W0312 13:49:45.375322 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0ab9f2_e0b6_40e1_9816_a11f8135ed75.slice/crio-199bf42fabfd3a8e4be5f2785d20ff909d4509972bc989ce33729ce9a3c38a34 WatchSource:0}: Error finding container 199bf42fabfd3a8e4be5f2785d20ff909d4509972bc989ce33729ce9a3c38a34: Status 404 returned error can't find the container with id 199bf42fabfd3a8e4be5f2785d20ff909d4509972bc989ce33729ce9a3c38a34 Mar 12 13:49:45 crc kubenswrapper[4921]: I0312 13:49:45.407604 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" event={"ID":"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75","Type":"ContainerStarted","Data":"199bf42fabfd3a8e4be5f2785d20ff909d4509972bc989ce33729ce9a3c38a34"} Mar 12 13:49:46 crc kubenswrapper[4921]: I0312 13:49:46.418436 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" event={"ID":"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75","Type":"ContainerStarted","Data":"05037cb851f531599903312d9ad37e2f623f347163689beaf0a6223e7c8d46dd"} Mar 12 13:49:46 crc kubenswrapper[4921]: I0312 13:49:46.440782 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" podStartSLOduration=1.881244872 podStartE2EDuration="2.440765486s" podCreationTimestamp="2026-03-12 13:49:44 +0000 UTC" firstStartedPulling="2026-03-12 13:49:45.378863264 +0000 UTC m=+2408.068935275" lastFinishedPulling="2026-03-12 13:49:45.938383918 +0000 UTC m=+2408.628455889" observedRunningTime="2026-03-12 13:49:46.43955198 +0000 UTC m=+2409.129623951" watchObservedRunningTime="2026-03-12 13:49:46.440765486 +0000 UTC m=+2409.130837447" Mar 12 13:49:51 crc kubenswrapper[4921]: I0312 13:49:51.984115 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:49:51 crc kubenswrapper[4921]: E0312 13:49:51.985225 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.142007 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555390-sbtp8"] Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.144325 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.146360 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.146589 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.146750 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.157365 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555390-sbtp8"] Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.288716 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7f5j\" (UniqueName: \"kubernetes.io/projected/d12d4f9f-e152-463e-b0d8-c93036e5f85b-kube-api-access-f7f5j\") pod \"auto-csr-approver-29555390-sbtp8\" (UID: \"d12d4f9f-e152-463e-b0d8-c93036e5f85b\") " pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.390876 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f5j\" (UniqueName: \"kubernetes.io/projected/d12d4f9f-e152-463e-b0d8-c93036e5f85b-kube-api-access-f7f5j\") pod \"auto-csr-approver-29555390-sbtp8\" (UID: \"d12d4f9f-e152-463e-b0d8-c93036e5f85b\") " pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.411355 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7f5j\" (UniqueName: \"kubernetes.io/projected/d12d4f9f-e152-463e-b0d8-c93036e5f85b-kube-api-access-f7f5j\") pod \"auto-csr-approver-29555390-sbtp8\" (UID: \"d12d4f9f-e152-463e-b0d8-c93036e5f85b\") " 
pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.465965 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:00 crc kubenswrapper[4921]: I0312 13:50:00.919443 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555390-sbtp8"] Mar 12 13:50:00 crc kubenswrapper[4921]: W0312 13:50:00.922646 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd12d4f9f_e152_463e_b0d8_c93036e5f85b.slice/crio-f4ecec32b384274d03a4d519c4dabf8b560be4d33a85c5d8f1fa4542cd0a4619 WatchSource:0}: Error finding container f4ecec32b384274d03a4d519c4dabf8b560be4d33a85c5d8f1fa4542cd0a4619: Status 404 returned error can't find the container with id f4ecec32b384274d03a4d519c4dabf8b560be4d33a85c5d8f1fa4542cd0a4619 Mar 12 13:50:01 crc kubenswrapper[4921]: I0312 13:50:01.563178 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" event={"ID":"d12d4f9f-e152-463e-b0d8-c93036e5f85b","Type":"ContainerStarted","Data":"f4ecec32b384274d03a4d519c4dabf8b560be4d33a85c5d8f1fa4542cd0a4619"} Mar 12 13:50:02 crc kubenswrapper[4921]: I0312 13:50:02.571489 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" event={"ID":"d12d4f9f-e152-463e-b0d8-c93036e5f85b","Type":"ContainerStarted","Data":"35862c3da763b439a0e8d53f4a00f7b7e1ff9430209db60f461494e4c1f85d94"} Mar 12 13:50:02 crc kubenswrapper[4921]: I0312 13:50:02.591995 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" podStartSLOduration=1.411403883 podStartE2EDuration="2.591972899s" podCreationTimestamp="2026-03-12 13:50:00 +0000 UTC" firstStartedPulling="2026-03-12 13:50:00.925084825 +0000 UTC 
m=+2423.615156796" lastFinishedPulling="2026-03-12 13:50:02.105653841 +0000 UTC m=+2424.795725812" observedRunningTime="2026-03-12 13:50:02.584636585 +0000 UTC m=+2425.274708556" watchObservedRunningTime="2026-03-12 13:50:02.591972899 +0000 UTC m=+2425.282044870" Mar 12 13:50:03 crc kubenswrapper[4921]: I0312 13:50:03.585851 4921 generic.go:334] "Generic (PLEG): container finished" podID="d12d4f9f-e152-463e-b0d8-c93036e5f85b" containerID="35862c3da763b439a0e8d53f4a00f7b7e1ff9430209db60f461494e4c1f85d94" exitCode=0 Mar 12 13:50:03 crc kubenswrapper[4921]: I0312 13:50:03.585930 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" event={"ID":"d12d4f9f-e152-463e-b0d8-c93036e5f85b","Type":"ContainerDied","Data":"35862c3da763b439a0e8d53f4a00f7b7e1ff9430209db60f461494e4c1f85d94"} Mar 12 13:50:04 crc kubenswrapper[4921]: I0312 13:50:04.983575 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:50:04 crc kubenswrapper[4921]: E0312 13:50:04.984391 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:50:04 crc kubenswrapper[4921]: I0312 13:50:04.989518 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.077850 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7f5j\" (UniqueName: \"kubernetes.io/projected/d12d4f9f-e152-463e-b0d8-c93036e5f85b-kube-api-access-f7f5j\") pod \"d12d4f9f-e152-463e-b0d8-c93036e5f85b\" (UID: \"d12d4f9f-e152-463e-b0d8-c93036e5f85b\") " Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.093042 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12d4f9f-e152-463e-b0d8-c93036e5f85b-kube-api-access-f7f5j" (OuterVolumeSpecName: "kube-api-access-f7f5j") pod "d12d4f9f-e152-463e-b0d8-c93036e5f85b" (UID: "d12d4f9f-e152-463e-b0d8-c93036e5f85b"). InnerVolumeSpecName "kube-api-access-f7f5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.181198 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7f5j\" (UniqueName: \"kubernetes.io/projected/d12d4f9f-e152-463e-b0d8-c93036e5f85b-kube-api-access-f7f5j\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.606462 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" event={"ID":"d12d4f9f-e152-463e-b0d8-c93036e5f85b","Type":"ContainerDied","Data":"f4ecec32b384274d03a4d519c4dabf8b560be4d33a85c5d8f1fa4542cd0a4619"} Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.606512 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4ecec32b384274d03a4d519c4dabf8b560be4d33a85c5d8f1fa4542cd0a4619" Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.606565 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555390-sbtp8" Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.677289 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555384-ckh8q"] Mar 12 13:50:05 crc kubenswrapper[4921]: I0312 13:50:05.687076 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555384-ckh8q"] Mar 12 13:50:06 crc kubenswrapper[4921]: I0312 13:50:06.004046 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="993efdcb-eebd-4fef-87eb-3a28609a17c4" path="/var/lib/kubelet/pods/993efdcb-eebd-4fef-87eb-3a28609a17c4/volumes" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.616005 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dkhw2"] Mar 12 13:50:08 crc kubenswrapper[4921]: E0312 13:50:08.616618 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12d4f9f-e152-463e-b0d8-c93036e5f85b" containerName="oc" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.616630 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12d4f9f-e152-463e-b0d8-c93036e5f85b" containerName="oc" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.616824 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12d4f9f-e152-463e-b0d8-c93036e5f85b" containerName="oc" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.617961 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.635363 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkhw2"] Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.747162 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqnhf\" (UniqueName: \"kubernetes.io/projected/15b62313-2437-40eb-902a-626b26aee9ee-kube-api-access-zqnhf\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.747489 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-catalog-content\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.747771 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-utilities\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.850226 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqnhf\" (UniqueName: \"kubernetes.io/projected/15b62313-2437-40eb-902a-626b26aee9ee-kube-api-access-zqnhf\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.850561 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-catalog-content\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.850721 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-utilities\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.851079 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-catalog-content\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.851110 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-utilities\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.869635 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqnhf\" (UniqueName: \"kubernetes.io/projected/15b62313-2437-40eb-902a-626b26aee9ee-kube-api-access-zqnhf\") pod \"redhat-marketplace-dkhw2\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:08 crc kubenswrapper[4921]: I0312 13:50:08.940341 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:09 crc kubenswrapper[4921]: I0312 13:50:09.387751 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkhw2"] Mar 12 13:50:09 crc kubenswrapper[4921]: I0312 13:50:09.639257 4921 generic.go:334] "Generic (PLEG): container finished" podID="15b62313-2437-40eb-902a-626b26aee9ee" containerID="81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc" exitCode=0 Mar 12 13:50:09 crc kubenswrapper[4921]: I0312 13:50:09.639309 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerDied","Data":"81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc"} Mar 12 13:50:09 crc kubenswrapper[4921]: I0312 13:50:09.639358 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerStarted","Data":"171c9dd8f4c2d3b9cce245fe41dd9eea47280ca5be6abdd3c8639ccc07247096"} Mar 12 13:50:10 crc kubenswrapper[4921]: I0312 13:50:10.649378 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerStarted","Data":"2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3"} Mar 12 13:50:11 crc kubenswrapper[4921]: I0312 13:50:11.660652 4921 generic.go:334] "Generic (PLEG): container finished" podID="15b62313-2437-40eb-902a-626b26aee9ee" containerID="2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3" exitCode=0 Mar 12 13:50:11 crc kubenswrapper[4921]: I0312 13:50:11.660715 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" 
event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerDied","Data":"2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3"} Mar 12 13:50:12 crc kubenswrapper[4921]: I0312 13:50:12.672908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerStarted","Data":"817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c"} Mar 12 13:50:12 crc kubenswrapper[4921]: I0312 13:50:12.710475 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dkhw2" podStartSLOduration=2.088487325 podStartE2EDuration="4.71045288s" podCreationTimestamp="2026-03-12 13:50:08 +0000 UTC" firstStartedPulling="2026-03-12 13:50:09.640694233 +0000 UTC m=+2432.330766204" lastFinishedPulling="2026-03-12 13:50:12.262659768 +0000 UTC m=+2434.952731759" observedRunningTime="2026-03-12 13:50:12.698262917 +0000 UTC m=+2435.388334888" watchObservedRunningTime="2026-03-12 13:50:12.71045288 +0000 UTC m=+2435.400524871" Mar 12 13:50:15 crc kubenswrapper[4921]: I0312 13:50:15.983668 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:50:15 crc kubenswrapper[4921]: E0312 13:50:15.985183 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:50:18 crc kubenswrapper[4921]: I0312 13:50:18.940575 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:18 crc 
kubenswrapper[4921]: I0312 13:50:18.940965 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:18 crc kubenswrapper[4921]: I0312 13:50:18.993514 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:19 crc kubenswrapper[4921]: I0312 13:50:19.794106 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:20 crc kubenswrapper[4921]: I0312 13:50:20.409392 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkhw2"] Mar 12 13:50:21 crc kubenswrapper[4921]: I0312 13:50:21.750000 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dkhw2" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="registry-server" containerID="cri-o://817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c" gracePeriod=2 Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.211437 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.295258 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqnhf\" (UniqueName: \"kubernetes.io/projected/15b62313-2437-40eb-902a-626b26aee9ee-kube-api-access-zqnhf\") pod \"15b62313-2437-40eb-902a-626b26aee9ee\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.295322 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-utilities\") pod \"15b62313-2437-40eb-902a-626b26aee9ee\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.295423 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-catalog-content\") pod \"15b62313-2437-40eb-902a-626b26aee9ee\" (UID: \"15b62313-2437-40eb-902a-626b26aee9ee\") " Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.297836 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-utilities" (OuterVolumeSpecName: "utilities") pod "15b62313-2437-40eb-902a-626b26aee9ee" (UID: "15b62313-2437-40eb-902a-626b26aee9ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.307000 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15b62313-2437-40eb-902a-626b26aee9ee-kube-api-access-zqnhf" (OuterVolumeSpecName: "kube-api-access-zqnhf") pod "15b62313-2437-40eb-902a-626b26aee9ee" (UID: "15b62313-2437-40eb-902a-626b26aee9ee"). InnerVolumeSpecName "kube-api-access-zqnhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.322253 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15b62313-2437-40eb-902a-626b26aee9ee" (UID: "15b62313-2437-40eb-902a-626b26aee9ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.397303 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqnhf\" (UniqueName: \"kubernetes.io/projected/15b62313-2437-40eb-902a-626b26aee9ee-kube-api-access-zqnhf\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.397343 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.397353 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15b62313-2437-40eb-902a-626b26aee9ee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.766902 4921 generic.go:334] "Generic (PLEG): container finished" podID="15b62313-2437-40eb-902a-626b26aee9ee" containerID="817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c" exitCode=0 Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.766971 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerDied","Data":"817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c"} Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.767013 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-dkhw2" event={"ID":"15b62313-2437-40eb-902a-626b26aee9ee","Type":"ContainerDied","Data":"171c9dd8f4c2d3b9cce245fe41dd9eea47280ca5be6abdd3c8639ccc07247096"} Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.767043 4921 scope.go:117] "RemoveContainer" containerID="817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.767238 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkhw2" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.828622 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkhw2"] Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.831549 4921 scope.go:117] "RemoveContainer" containerID="2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.844511 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkhw2"] Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.867216 4921 scope.go:117] "RemoveContainer" containerID="81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.920350 4921 scope.go:117] "RemoveContainer" containerID="817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c" Mar 12 13:50:22 crc kubenswrapper[4921]: E0312 13:50:22.921403 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c\": container with ID starting with 817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c not found: ID does not exist" containerID="817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.921453 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c"} err="failed to get container status \"817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c\": rpc error: code = NotFound desc = could not find container \"817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c\": container with ID starting with 817c96943b87b89446537f15670547046c5c7ebe56dcd9cf68dfc423114e924c not found: ID does not exist" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.921487 4921 scope.go:117] "RemoveContainer" containerID="2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3" Mar 12 13:50:22 crc kubenswrapper[4921]: E0312 13:50:22.922072 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3\": container with ID starting with 2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3 not found: ID does not exist" containerID="2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.922108 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3"} err="failed to get container status \"2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3\": rpc error: code = NotFound desc = could not find container \"2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3\": container with ID starting with 2f4ebc901b9eb5aa4a6b5229b7361b8903f7c5996b6257a8f40d80e1d7894da3 not found: ID does not exist" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.922134 4921 scope.go:117] "RemoveContainer" containerID="81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc" Mar 12 13:50:22 crc kubenswrapper[4921]: E0312 
13:50:22.922379 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc\": container with ID starting with 81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc not found: ID does not exist" containerID="81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc" Mar 12 13:50:22 crc kubenswrapper[4921]: I0312 13:50:22.922407 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc"} err="failed to get container status \"81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc\": rpc error: code = NotFound desc = could not find container \"81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc\": container with ID starting with 81240ce1dbf8535f0b10b1fb1d3bb915109c54dfc790bf0585c9e46f1864edfc not found: ID does not exist" Mar 12 13:50:24 crc kubenswrapper[4921]: I0312 13:50:24.018094 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15b62313-2437-40eb-902a-626b26aee9ee" path="/var/lib/kubelet/pods/15b62313-2437-40eb-902a-626b26aee9ee/volumes" Mar 12 13:50:27 crc kubenswrapper[4921]: I0312 13:50:27.993882 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:50:27 crc kubenswrapper[4921]: E0312 13:50:27.995099 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:50:28 crc kubenswrapper[4921]: I0312 13:50:28.819423 
4921 generic.go:334] "Generic (PLEG): container finished" podID="5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" containerID="05037cb851f531599903312d9ad37e2f623f347163689beaf0a6223e7c8d46dd" exitCode=0 Mar 12 13:50:28 crc kubenswrapper[4921]: I0312 13:50:28.819478 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" event={"ID":"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75","Type":"ContainerDied","Data":"05037cb851f531599903312d9ad37e2f623f347163689beaf0a6223e7c8d46dd"} Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.275784 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.361969 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-inventory\") pod \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.362064 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-kube-api-access-kp2fs\") pod \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.362094 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ceph\") pod \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.362117 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ssh-key-openstack-edpm-ipam\") pod \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\" (UID: \"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75\") " Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.374973 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-kube-api-access-kp2fs" (OuterVolumeSpecName: "kube-api-access-kp2fs") pod "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" (UID: "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75"). InnerVolumeSpecName "kube-api-access-kp2fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.375085 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ceph" (OuterVolumeSpecName: "ceph") pod "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" (UID: "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.387083 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-inventory" (OuterVolumeSpecName: "inventory") pod "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" (UID: "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.412392 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" (UID: "5a0ab9f2-e0b6-40e1-9816-a11f8135ed75"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.464433 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.464485 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp2fs\" (UniqueName: \"kubernetes.io/projected/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-kube-api-access-kp2fs\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.464497 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.464509 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0ab9f2-e0b6-40e1-9816-a11f8135ed75-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.845021 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" event={"ID":"5a0ab9f2-e0b6-40e1-9816-a11f8135ed75","Type":"ContainerDied","Data":"199bf42fabfd3a8e4be5f2785d20ff909d4509972bc989ce33729ce9a3c38a34"} Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.845070 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="199bf42fabfd3a8e4be5f2785d20ff909d4509972bc989ce33729ce9a3c38a34" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.845080 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pcpck" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.952715 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7x2dm"] Mar 12 13:50:30 crc kubenswrapper[4921]: E0312 13:50:30.953142 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="extract-utilities" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953162 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="extract-utilities" Mar 12 13:50:30 crc kubenswrapper[4921]: E0312 13:50:30.953195 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="registry-server" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="registry-server" Mar 12 13:50:30 crc kubenswrapper[4921]: E0312 13:50:30.953220 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953228 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:50:30 crc kubenswrapper[4921]: E0312 13:50:30.953240 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="extract-content" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953245 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="extract-content" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953409 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0ab9f2-e0b6-40e1-9816-a11f8135ed75" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953420 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="15b62313-2437-40eb-902a-626b26aee9ee" containerName="registry-server" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.953998 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.968869 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7x2dm"] Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.973724 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.974547 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.974767 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.974913 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:50:30 crc kubenswrapper[4921]: I0312 13:50:30.975017 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.074033 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2tf\" (UniqueName: \"kubernetes.io/projected/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-kube-api-access-cz2tf\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") 
" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.074168 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.074215 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.074265 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ceph\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.176059 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.176127 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.176185 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ceph\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.176235 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2tf\" (UniqueName: \"kubernetes.io/projected/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-kube-api-access-cz2tf\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.181521 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ceph\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.182396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.184495 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.196976 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2tf\" (UniqueName: \"kubernetes.io/projected/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-kube-api-access-cz2tf\") pod \"ssh-known-hosts-edpm-deployment-7x2dm\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.274448 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.573744 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-7x2dm"] Mar 12 13:50:31 crc kubenswrapper[4921]: W0312 13:50:31.577000 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc60d30_c59f_4cd4_b798_7e8214c0fa52.slice/crio-94f6edb548b75e447fbe31302a2e696ce32e9481b1d540f4169633ea2c950b5e WatchSource:0}: Error finding container 94f6edb548b75e447fbe31302a2e696ce32e9481b1d540f4169633ea2c950b5e: Status 404 returned error can't find the container with id 94f6edb548b75e447fbe31302a2e696ce32e9481b1d540f4169633ea2c950b5e Mar 12 13:50:31 crc kubenswrapper[4921]: I0312 13:50:31.854465 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" event={"ID":"7dc60d30-c59f-4cd4-b798-7e8214c0fa52","Type":"ContainerStarted","Data":"94f6edb548b75e447fbe31302a2e696ce32e9481b1d540f4169633ea2c950b5e"} Mar 12 13:50:32 crc kubenswrapper[4921]: I0312 13:50:32.865225 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" event={"ID":"7dc60d30-c59f-4cd4-b798-7e8214c0fa52","Type":"ContainerStarted","Data":"855b6239551320ec9ab7ea251bde24e94a14a35910fb7ce9cc7bf5076817994e"} Mar 12 13:50:38 crc kubenswrapper[4921]: I0312 13:50:38.983455 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:50:38 crc kubenswrapper[4921]: E0312 13:50:38.984663 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:50:40 crc kubenswrapper[4921]: I0312 13:50:40.937403 4921 generic.go:334] "Generic (PLEG): container finished" podID="7dc60d30-c59f-4cd4-b798-7e8214c0fa52" containerID="855b6239551320ec9ab7ea251bde24e94a14a35910fb7ce9cc7bf5076817994e" exitCode=0 Mar 12 13:50:40 crc kubenswrapper[4921]: I0312 13:50:40.937573 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" event={"ID":"7dc60d30-c59f-4cd4-b798-7e8214c0fa52","Type":"ContainerDied","Data":"855b6239551320ec9ab7ea251bde24e94a14a35910fb7ce9cc7bf5076817994e"} Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.375618 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.490047 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz2tf\" (UniqueName: \"kubernetes.io/projected/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-kube-api-access-cz2tf\") pod \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.490865 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-inventory-0\") pod \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.491084 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ceph\") pod \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.491295 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ssh-key-openstack-edpm-ipam\") pod \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\" (UID: \"7dc60d30-c59f-4cd4-b798-7e8214c0fa52\") " Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.496010 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-kube-api-access-cz2tf" (OuterVolumeSpecName: "kube-api-access-cz2tf") pod "7dc60d30-c59f-4cd4-b798-7e8214c0fa52" (UID: "7dc60d30-c59f-4cd4-b798-7e8214c0fa52"). InnerVolumeSpecName "kube-api-access-cz2tf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.496185 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ceph" (OuterVolumeSpecName: "ceph") pod "7dc60d30-c59f-4cd4-b798-7e8214c0fa52" (UID: "7dc60d30-c59f-4cd4-b798-7e8214c0fa52"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.522077 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "7dc60d30-c59f-4cd4-b798-7e8214c0fa52" (UID: "7dc60d30-c59f-4cd4-b798-7e8214c0fa52"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.522517 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7dc60d30-c59f-4cd4-b798-7e8214c0fa52" (UID: "7dc60d30-c59f-4cd4-b798-7e8214c0fa52"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.593479 4921 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.593561 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.593602 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.593621 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz2tf\" (UniqueName: \"kubernetes.io/projected/7dc60d30-c59f-4cd4-b798-7e8214c0fa52-kube-api-access-cz2tf\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.959215 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" event={"ID":"7dc60d30-c59f-4cd4-b798-7e8214c0fa52","Type":"ContainerDied","Data":"94f6edb548b75e447fbe31302a2e696ce32e9481b1d540f4169633ea2c950b5e"} Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.959265 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94f6edb548b75e447fbe31302a2e696ce32e9481b1d540f4169633ea2c950b5e" Mar 12 13:50:42 crc kubenswrapper[4921]: I0312 13:50:42.959292 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-7x2dm" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.037359 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd"] Mar 12 13:50:43 crc kubenswrapper[4921]: E0312 13:50:43.037687 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc60d30-c59f-4cd4-b798-7e8214c0fa52" containerName="ssh-known-hosts-edpm-deployment" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.037706 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc60d30-c59f-4cd4-b798-7e8214c0fa52" containerName="ssh-known-hosts-edpm-deployment" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.037889 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc60d30-c59f-4cd4-b798-7e8214c0fa52" containerName="ssh-known-hosts-edpm-deployment" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.038429 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.042157 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.042184 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.042285 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.042347 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.044220 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.064050 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd"] Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.103518 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpgj4\" (UniqueName: \"kubernetes.io/projected/095fb2e2-a411-4c41-bf21-1c8b69166a54-kube-api-access-wpgj4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.103640 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.103720 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.103838 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.206125 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.206212 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.206256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wpgj4\" (UniqueName: \"kubernetes.io/projected/095fb2e2-a411-4c41-bf21-1c8b69166a54-kube-api-access-wpgj4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.206329 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.211552 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.212382 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.212527 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: 
I0312 13:50:43.230972 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpgj4\" (UniqueName: \"kubernetes.io/projected/095fb2e2-a411-4c41-bf21-1c8b69166a54-kube-api-access-wpgj4\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nzzfd\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.354268 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.901069 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd"] Mar 12 13:50:43 crc kubenswrapper[4921]: I0312 13:50:43.968242 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" event={"ID":"095fb2e2-a411-4c41-bf21-1c8b69166a54","Type":"ContainerStarted","Data":"a752421f1e1870aecbaf811ec1f852a7511860cd84ff023a19a749549c4b0e70"} Mar 12 13:50:44 crc kubenswrapper[4921]: I0312 13:50:44.093570 4921 scope.go:117] "RemoveContainer" containerID="a308e62970f42847874ce1824eca4d190448484417d1fcf05ddc2f476ca3fab5" Mar 12 13:50:44 crc kubenswrapper[4921]: I0312 13:50:44.979516 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" event={"ID":"095fb2e2-a411-4c41-bf21-1c8b69166a54","Type":"ContainerStarted","Data":"7ae559fa6e4381b9b5ac2f8a8fd8482140c5c8de9df6b8be0840dbed6a379859"} Mar 12 13:50:45 crc kubenswrapper[4921]: I0312 13:50:45.002139 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" podStartSLOduration=1.571746855 podStartE2EDuration="2.002104967s" podCreationTimestamp="2026-03-12 13:50:43 +0000 UTC" firstStartedPulling="2026-03-12 
13:50:43.904616507 +0000 UTC m=+2466.594688478" lastFinishedPulling="2026-03-12 13:50:44.334974609 +0000 UTC m=+2467.025046590" observedRunningTime="2026-03-12 13:50:45.000877339 +0000 UTC m=+2467.690949370" watchObservedRunningTime="2026-03-12 13:50:45.002104967 +0000 UTC m=+2467.692176968" Mar 12 13:50:51 crc kubenswrapper[4921]: I0312 13:50:51.985233 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:50:51 crc kubenswrapper[4921]: E0312 13:50:51.986125 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:50:52 crc kubenswrapper[4921]: I0312 13:50:52.036419 4921 generic.go:334] "Generic (PLEG): container finished" podID="095fb2e2-a411-4c41-bf21-1c8b69166a54" containerID="7ae559fa6e4381b9b5ac2f8a8fd8482140c5c8de9df6b8be0840dbed6a379859" exitCode=0 Mar 12 13:50:52 crc kubenswrapper[4921]: I0312 13:50:52.036474 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" event={"ID":"095fb2e2-a411-4c41-bf21-1c8b69166a54","Type":"ContainerDied","Data":"7ae559fa6e4381b9b5ac2f8a8fd8482140c5c8de9df6b8be0840dbed6a379859"} Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.493539 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.605606 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-inventory\") pod \"095fb2e2-a411-4c41-bf21-1c8b69166a54\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.605727 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ceph\") pod \"095fb2e2-a411-4c41-bf21-1c8b69166a54\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.605770 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpgj4\" (UniqueName: \"kubernetes.io/projected/095fb2e2-a411-4c41-bf21-1c8b69166a54-kube-api-access-wpgj4\") pod \"095fb2e2-a411-4c41-bf21-1c8b69166a54\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.605887 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ssh-key-openstack-edpm-ipam\") pod \"095fb2e2-a411-4c41-bf21-1c8b69166a54\" (UID: \"095fb2e2-a411-4c41-bf21-1c8b69166a54\") " Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.611361 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ceph" (OuterVolumeSpecName: "ceph") pod "095fb2e2-a411-4c41-bf21-1c8b69166a54" (UID: "095fb2e2-a411-4c41-bf21-1c8b69166a54"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.612309 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095fb2e2-a411-4c41-bf21-1c8b69166a54-kube-api-access-wpgj4" (OuterVolumeSpecName: "kube-api-access-wpgj4") pod "095fb2e2-a411-4c41-bf21-1c8b69166a54" (UID: "095fb2e2-a411-4c41-bf21-1c8b69166a54"). InnerVolumeSpecName "kube-api-access-wpgj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.639200 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-inventory" (OuterVolumeSpecName: "inventory") pod "095fb2e2-a411-4c41-bf21-1c8b69166a54" (UID: "095fb2e2-a411-4c41-bf21-1c8b69166a54"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.639970 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "095fb2e2-a411-4c41-bf21-1c8b69166a54" (UID: "095fb2e2-a411-4c41-bf21-1c8b69166a54"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.707707 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.707747 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.707759 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/095fb2e2-a411-4c41-bf21-1c8b69166a54-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:53 crc kubenswrapper[4921]: I0312 13:50:53.707773 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpgj4\" (UniqueName: \"kubernetes.io/projected/095fb2e2-a411-4c41-bf21-1c8b69166a54-kube-api-access-wpgj4\") on node \"crc\" DevicePath \"\"" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.061530 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" event={"ID":"095fb2e2-a411-4c41-bf21-1c8b69166a54","Type":"ContainerDied","Data":"a752421f1e1870aecbaf811ec1f852a7511860cd84ff023a19a749549c4b0e70"} Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.061566 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a752421f1e1870aecbaf811ec1f852a7511860cd84ff023a19a749549c4b0e70" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.061641 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nzzfd" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.150319 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z"] Mar 12 13:50:54 crc kubenswrapper[4921]: E0312 13:50:54.150712 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="095fb2e2-a411-4c41-bf21-1c8b69166a54" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.150726 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="095fb2e2-a411-4c41-bf21-1c8b69166a54" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.150916 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="095fb2e2-a411-4c41-bf21-1c8b69166a54" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.151474 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.153578 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.154325 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.154342 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.154478 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.154742 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.156924 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z"] Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.217886 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.218053 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.218078 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k6zv\" (UniqueName: \"kubernetes.io/projected/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-kube-api-access-9k6zv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.218102 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.319842 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.320230 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k6zv\" (UniqueName: \"kubernetes.io/projected/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-kube-api-access-9k6zv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.320270 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.320369 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.325195 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.325471 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.331353 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.335642 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k6zv\" (UniqueName: \"kubernetes.io/projected/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-kube-api-access-9k6zv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.509548 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:50:54 crc kubenswrapper[4921]: I0312 13:50:54.987557 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z"] Mar 12 13:50:55 crc kubenswrapper[4921]: I0312 13:50:55.069651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" event={"ID":"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2","Type":"ContainerStarted","Data":"dba4da49f77099d4260d143364382a1092c4c757a4e3236650c4d5b35f3c5980"} Mar 12 13:50:56 crc kubenswrapper[4921]: I0312 13:50:56.088323 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" event={"ID":"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2","Type":"ContainerStarted","Data":"9c552cf2192c938af872854552171a9ff7fba54e53319c8aaa52faf25f8a2b61"} Mar 12 13:50:56 crc kubenswrapper[4921]: I0312 13:50:56.111675 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" podStartSLOduration=1.366042945 podStartE2EDuration="2.11165048s" podCreationTimestamp="2026-03-12 13:50:54 +0000 UTC" firstStartedPulling="2026-03-12 13:50:55.005873376 +0000 UTC m=+2477.695945347" 
lastFinishedPulling="2026-03-12 13:50:55.751480891 +0000 UTC m=+2478.441552882" observedRunningTime="2026-03-12 13:50:56.105952555 +0000 UTC m=+2478.796024566" watchObservedRunningTime="2026-03-12 13:50:56.11165048 +0000 UTC m=+2478.801722491" Mar 12 13:50:59 crc kubenswrapper[4921]: I0312 13:50:59.809496 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pc87q"] Mar 12 13:50:59 crc kubenswrapper[4921]: I0312 13:50:59.821155 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:50:59 crc kubenswrapper[4921]: I0312 13:50:59.832378 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pc87q"] Mar 12 13:50:59 crc kubenswrapper[4921]: I0312 13:50:59.941402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-catalog-content\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:50:59 crc kubenswrapper[4921]: I0312 13:50:59.941689 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-utilities\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:50:59 crc kubenswrapper[4921]: I0312 13:50:59.941775 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fzt\" (UniqueName: \"kubernetes.io/projected/781d89ba-908d-43e4-8d07-487e2daeafdf-kube-api-access-56fzt\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " 
pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.043693 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-utilities\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.043770 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fzt\" (UniqueName: \"kubernetes.io/projected/781d89ba-908d-43e4-8d07-487e2daeafdf-kube-api-access-56fzt\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.043894 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-catalog-content\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.044149 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-utilities\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.044362 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-catalog-content\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " 
pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.068206 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fzt\" (UniqueName: \"kubernetes.io/projected/781d89ba-908d-43e4-8d07-487e2daeafdf-kube-api-access-56fzt\") pod \"certified-operators-pc87q\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.140968 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:00 crc kubenswrapper[4921]: I0312 13:51:00.696197 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pc87q"] Mar 12 13:51:00 crc kubenswrapper[4921]: W0312 13:51:00.703157 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod781d89ba_908d_43e4_8d07_487e2daeafdf.slice/crio-6d106b8be4ba8cb7f27f0db1ac850ada30821cf78458198923507e5a1ba048bd WatchSource:0}: Error finding container 6d106b8be4ba8cb7f27f0db1ac850ada30821cf78458198923507e5a1ba048bd: Status 404 returned error can't find the container with id 6d106b8be4ba8cb7f27f0db1ac850ada30821cf78458198923507e5a1ba048bd Mar 12 13:51:01 crc kubenswrapper[4921]: I0312 13:51:01.133731 4921 generic.go:334] "Generic (PLEG): container finished" podID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerID="94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93" exitCode=0 Mar 12 13:51:01 crc kubenswrapper[4921]: I0312 13:51:01.133781 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerDied","Data":"94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93"} Mar 12 13:51:01 crc kubenswrapper[4921]: I0312 13:51:01.133827 
4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerStarted","Data":"6d106b8be4ba8cb7f27f0db1ac850ada30821cf78458198923507e5a1ba048bd"} Mar 12 13:51:02 crc kubenswrapper[4921]: I0312 13:51:02.146274 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerStarted","Data":"ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466"} Mar 12 13:51:03 crc kubenswrapper[4921]: I0312 13:51:03.158481 4921 generic.go:334] "Generic (PLEG): container finished" podID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerID="ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466" exitCode=0 Mar 12 13:51:03 crc kubenswrapper[4921]: I0312 13:51:03.158543 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerDied","Data":"ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466"} Mar 12 13:51:04 crc kubenswrapper[4921]: I0312 13:51:04.169223 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerStarted","Data":"4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e"} Mar 12 13:51:04 crc kubenswrapper[4921]: I0312 13:51:04.200322 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pc87q" podStartSLOduration=2.757343878 podStartE2EDuration="5.200303469s" podCreationTimestamp="2026-03-12 13:50:59 +0000 UTC" firstStartedPulling="2026-03-12 13:51:01.135708224 +0000 UTC m=+2483.825780205" lastFinishedPulling="2026-03-12 13:51:03.578667815 +0000 UTC m=+2486.268739796" observedRunningTime="2026-03-12 
13:51:04.193596783 +0000 UTC m=+2486.883668754" watchObservedRunningTime="2026-03-12 13:51:04.200303469 +0000 UTC m=+2486.890375440" Mar 12 13:51:04 crc kubenswrapper[4921]: I0312 13:51:04.985568 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:51:04 crc kubenswrapper[4921]: E0312 13:51:04.986978 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:51:06 crc kubenswrapper[4921]: I0312 13:51:06.190560 4921 generic.go:334] "Generic (PLEG): container finished" podID="55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" containerID="9c552cf2192c938af872854552171a9ff7fba54e53319c8aaa52faf25f8a2b61" exitCode=0 Mar 12 13:51:06 crc kubenswrapper[4921]: I0312 13:51:06.190913 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" event={"ID":"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2","Type":"ContainerDied","Data":"9c552cf2192c938af872854552171a9ff7fba54e53319c8aaa52faf25f8a2b61"} Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.606983 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.704552 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ssh-key-openstack-edpm-ipam\") pod \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.704704 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k6zv\" (UniqueName: \"kubernetes.io/projected/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-kube-api-access-9k6zv\") pod \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.704790 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-inventory\") pod \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.704841 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ceph\") pod \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\" (UID: \"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2\") " Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.709783 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-kube-api-access-9k6zv" (OuterVolumeSpecName: "kube-api-access-9k6zv") pod "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" (UID: "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2"). InnerVolumeSpecName "kube-api-access-9k6zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.709833 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ceph" (OuterVolumeSpecName: "ceph") pod "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" (UID: "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.728573 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-inventory" (OuterVolumeSpecName: "inventory") pod "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" (UID: "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.730587 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" (UID: "55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.807318 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k6zv\" (UniqueName: \"kubernetes.io/projected/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-kube-api-access-9k6zv\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.807358 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.807375 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:07 crc kubenswrapper[4921]: I0312 13:51:07.807394 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.207595 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" event={"ID":"55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2","Type":"ContainerDied","Data":"dba4da49f77099d4260d143364382a1092c4c757a4e3236650c4d5b35f3c5980"} Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.207635 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dba4da49f77099d4260d143364382a1092c4c757a4e3236650c4d5b35f3c5980" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.207710 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.285706 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl"] Mar 12 13:51:08 crc kubenswrapper[4921]: E0312 13:51:08.286245 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.286264 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.286458 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.287076 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.289399 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.289529 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.290577 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.291032 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.291066 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.291132 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.291199 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.292362 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.296212 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl"] Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419261 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419339 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419385 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419489 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419568 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419646 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419694 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419753 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqld\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-kube-api-access-zxqld\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419793 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.419942 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.420089 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.420269 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.420306 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522593 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522646 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522701 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522764 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522796 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522840 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522877 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522907 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522932 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522965 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxqld\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-kube-api-access-zxqld\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.522991 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.523032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc 
kubenswrapper[4921]: I0312 13:51:08.523098 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.528357 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.529146 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.529765 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.530410 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.530832 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.530860 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.531077 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.531893 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.532001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.532581 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.539172 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.540411 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: 
\"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.544889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxqld\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-kube-api-access-zxqld\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:08 crc kubenswrapper[4921]: I0312 13:51:08.652956 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" Mar 12 13:51:09 crc kubenswrapper[4921]: W0312 13:51:09.168410 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4eac827_ab86_4fef_b974_8638416f5125.slice/crio-8f4cba20c04282b58a9671053b4cf8299a3ed5f96e87f065fd808d2b154cd467 WatchSource:0}: Error finding container 8f4cba20c04282b58a9671053b4cf8299a3ed5f96e87f065fd808d2b154cd467: Status 404 returned error can't find the container with id 8f4cba20c04282b58a9671053b4cf8299a3ed5f96e87f065fd808d2b154cd467 Mar 12 13:51:09 crc kubenswrapper[4921]: I0312 13:51:09.174294 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl"] Mar 12 13:51:09 crc kubenswrapper[4921]: I0312 13:51:09.218996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" event={"ID":"c4eac827-ab86-4fef-b974-8638416f5125","Type":"ContainerStarted","Data":"8f4cba20c04282b58a9671053b4cf8299a3ed5f96e87f065fd808d2b154cd467"} Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.141694 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.142252 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.208932 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.239754 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" event={"ID":"c4eac827-ab86-4fef-b974-8638416f5125","Type":"ContainerStarted","Data":"0ffc2da603038c11f5b53feab08c1fb952a025d8e4263c1fb86bd5aa008a7565"} Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.282963 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" podStartSLOduration=1.743190156 podStartE2EDuration="2.282936429s" podCreationTimestamp="2026-03-12 13:51:08 +0000 UTC" firstStartedPulling="2026-03-12 13:51:09.171223254 +0000 UTC m=+2491.861295225" lastFinishedPulling="2026-03-12 13:51:09.710969537 +0000 UTC m=+2492.401041498" observedRunningTime="2026-03-12 13:51:10.266705892 +0000 UTC m=+2492.956777873" watchObservedRunningTime="2026-03-12 13:51:10.282936429 +0000 UTC m=+2492.973008440" Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.292734 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:10 crc kubenswrapper[4921]: I0312 13:51:10.446996 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pc87q"] Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.259199 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pc87q" 
podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="registry-server" containerID="cri-o://4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e" gracePeriod=2 Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.660110 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc87q" Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.752926 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-catalog-content\") pod \"781d89ba-908d-43e4-8d07-487e2daeafdf\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.752980 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fzt\" (UniqueName: \"kubernetes.io/projected/781d89ba-908d-43e4-8d07-487e2daeafdf-kube-api-access-56fzt\") pod \"781d89ba-908d-43e4-8d07-487e2daeafdf\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.753100 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-utilities\") pod \"781d89ba-908d-43e4-8d07-487e2daeafdf\" (UID: \"781d89ba-908d-43e4-8d07-487e2daeafdf\") " Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.754348 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-utilities" (OuterVolumeSpecName: "utilities") pod "781d89ba-908d-43e4-8d07-487e2daeafdf" (UID: "781d89ba-908d-43e4-8d07-487e2daeafdf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.764125 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781d89ba-908d-43e4-8d07-487e2daeafdf-kube-api-access-56fzt" (OuterVolumeSpecName: "kube-api-access-56fzt") pod "781d89ba-908d-43e4-8d07-487e2daeafdf" (UID: "781d89ba-908d-43e4-8d07-487e2daeafdf"). InnerVolumeSpecName "kube-api-access-56fzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.814836 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "781d89ba-908d-43e4-8d07-487e2daeafdf" (UID: "781d89ba-908d-43e4-8d07-487e2daeafdf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.854609 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fzt\" (UniqueName: \"kubernetes.io/projected/781d89ba-908d-43e4-8d07-487e2daeafdf-kube-api-access-56fzt\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.854642 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:12 crc kubenswrapper[4921]: I0312 13:51:12.854652 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781d89ba-908d-43e4-8d07-487e2daeafdf-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.267174 4921 generic.go:334] "Generic (PLEG): container finished" podID="781d89ba-908d-43e4-8d07-487e2daeafdf" 
containerID="4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e" exitCode=0
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.267238 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pc87q"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.267229 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerDied","Data":"4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e"}
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.267307 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pc87q" event={"ID":"781d89ba-908d-43e4-8d07-487e2daeafdf","Type":"ContainerDied","Data":"6d106b8be4ba8cb7f27f0db1ac850ada30821cf78458198923507e5a1ba048bd"}
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.267331 4921 scope.go:117] "RemoveContainer" containerID="4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.285776 4921 scope.go:117] "RemoveContainer" containerID="ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.299184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pc87q"]
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.307694 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pc87q"]
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.327644 4921 scope.go:117] "RemoveContainer" containerID="94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.346336 4921 scope.go:117] "RemoveContainer" containerID="4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e"
Mar 12 13:51:13 crc kubenswrapper[4921]: E0312 13:51:13.346685 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e\": container with ID starting with 4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e not found: ID does not exist" containerID="4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.346727 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e"} err="failed to get container status \"4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e\": rpc error: code = NotFound desc = could not find container \"4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e\": container with ID starting with 4ba8103b2a697dc9addbbb77adeacdf602bc8dbfed1bb4d71c61a8ffb1cc670e not found: ID does not exist"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.346753 4921 scope.go:117] "RemoveContainer" containerID="ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466"
Mar 12 13:51:13 crc kubenswrapper[4921]: E0312 13:51:13.347156 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466\": container with ID starting with ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466 not found: ID does not exist" containerID="ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.347197 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466"} err="failed to get container status \"ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466\": rpc error: code = NotFound desc = could not find container \"ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466\": container with ID starting with ba6c8e8fab77a09adce6d590a8cdc48bf4da882f10e98abd81363b907ad4a466 not found: ID does not exist"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.347224 4921 scope.go:117] "RemoveContainer" containerID="94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93"
Mar 12 13:51:13 crc kubenswrapper[4921]: E0312 13:51:13.347495 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93\": container with ID starting with 94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93 not found: ID does not exist" containerID="94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.347525 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93"} err="failed to get container status \"94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93\": rpc error: code = NotFound desc = could not find container \"94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93\": container with ID starting with 94ee6bc5f9b63876f152308907352f6dd0efeb710f126a3c94b69cba67efba93 not found: ID does not exist"
Mar 12 13:51:13 crc kubenswrapper[4921]: I0312 13:51:13.994952 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" path="/var/lib/kubelet/pods/781d89ba-908d-43e4-8d07-487e2daeafdf/volumes"
Mar 12 13:51:18 crc kubenswrapper[4921]: I0312 13:51:18.984288 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"
Mar 12 13:51:18 crc kubenswrapper[4921]: E0312 13:51:18.984926 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:51:33 crc kubenswrapper[4921]: I0312 13:51:33.983993 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"
Mar 12 13:51:33 crc kubenswrapper[4921]: E0312 13:51:33.984977 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:51:40 crc kubenswrapper[4921]: I0312 13:51:40.562037 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4eac827-ab86-4fef-b974-8638416f5125" containerID="0ffc2da603038c11f5b53feab08c1fb952a025d8e4263c1fb86bd5aa008a7565" exitCode=0
Mar 12 13:51:40 crc kubenswrapper[4921]: I0312 13:51:40.562112 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" event={"ID":"c4eac827-ab86-4fef-b974-8638416f5125","Type":"ContainerDied","Data":"0ffc2da603038c11f5b53feab08c1fb952a025d8e4263c1fb86bd5aa008a7565"}
Mar 12 13:51:41 crc kubenswrapper[4921]: I0312 13:51:41.990553 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.073714 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.073826 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ceph\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.073885 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.073915 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ssh-key-openstack-edpm-ipam\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074071 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-repo-setup-combined-ca-bundle\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074247 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-inventory\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074302 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ovn-combined-ca-bundle\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074349 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-nova-combined-ca-bundle\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074369 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-neutron-metadata-combined-ca-bundle\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074407 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-bootstrap-combined-ca-bundle\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074429 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074460 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxqld\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-kube-api-access-zxqld\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.074501 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-libvirt-combined-ca-bundle\") pod \"c4eac827-ab86-4fef-b974-8638416f5125\" (UID: \"c4eac827-ab86-4fef-b974-8638416f5125\") "
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.081534 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.081565 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ceph" (OuterVolumeSpecName: "ceph") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.082272 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.082348 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.082407 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.082827 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.083429 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.083490 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.084756 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.087018 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.100953 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-kube-api-access-zxqld" (OuterVolumeSpecName: "kube-api-access-zxqld") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "kube-api-access-zxqld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.101734 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-inventory" (OuterVolumeSpecName: "inventory") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.102717 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c4eac827-ab86-4fef-b974-8638416f5125" (UID: "c4eac827-ab86-4fef-b974-8638416f5125"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178124 4921 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178175 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-inventory\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178210 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178231 4921 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178250 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178267 4921 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178287 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178307 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxqld\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-kube-api-access-zxqld\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178324 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178342 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178364 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ceph\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178383 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c4eac827-ab86-4fef-b974-8638416f5125-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.178406 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c4eac827-ab86-4fef-b974-8638416f5125-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.584239 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl" event={"ID":"c4eac827-ab86-4fef-b974-8638416f5125","Type":"ContainerDied","Data":"8f4cba20c04282b58a9671053b4cf8299a3ed5f96e87f065fd808d2b154cd467"}
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.584314 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f4cba20c04282b58a9671053b4cf8299a3ed5f96e87f065fd808d2b154cd467"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.584329 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.701337 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"]
Mar 12 13:51:42 crc kubenswrapper[4921]: E0312 13:51:42.701792 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="registry-server"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.701834 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="registry-server"
Mar 12 13:51:42 crc kubenswrapper[4921]: E0312 13:51:42.701864 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="extract-content"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.701875 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="extract-content"
Mar 12 13:51:42 crc kubenswrapper[4921]: E0312 13:51:42.701896 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="extract-utilities"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.701904 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="extract-utilities"
Mar 12 13:51:42 crc kubenswrapper[4921]: E0312 13:51:42.701923 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4eac827-ab86-4fef-b974-8638416f5125" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.701934 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4eac827-ab86-4fef-b974-8638416f5125" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.702176 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4eac827-ab86-4fef-b974-8638416f5125" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.702214 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="781d89ba-908d-43e4-8d07-487e2daeafdf" containerName="registry-server"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.702979 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.711130 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.711572 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.711637 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.711825 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.712905 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.718254 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"]
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.790971 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b266l\" (UniqueName: \"kubernetes.io/projected/f5b6000a-13f1-4d52-9a03-3b777b3d651d-kube-api-access-b266l\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.791015 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.791210 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.791575 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.893316 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.893433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b266l\" (UniqueName: \"kubernetes.io/projected/f5b6000a-13f1-4d52-9a03-3b777b3d651d-kube-api-access-b266l\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.893459 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.893483 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.899288 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.900403 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.902335 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:42 crc kubenswrapper[4921]: I0312 13:51:42.909398 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b266l\" (UniqueName: \"kubernetes.io/projected/f5b6000a-13f1-4d52-9a03-3b777b3d651d-kube-api-access-b266l\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-dt558\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:43 crc kubenswrapper[4921]: I0312 13:51:43.021942 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:43 crc kubenswrapper[4921]: I0312 13:51:43.527716 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"]
Mar 12 13:51:43 crc kubenswrapper[4921]: W0312 13:51:43.532939 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5b6000a_13f1_4d52_9a03_3b777b3d651d.slice/crio-856cf405b183ef913bbab194fa6253a325dcfc20cfec79753a98ce1f29bde867 WatchSource:0}: Error finding container 856cf405b183ef913bbab194fa6253a325dcfc20cfec79753a98ce1f29bde867: Status 404 returned error can't find the container with id 856cf405b183ef913bbab194fa6253a325dcfc20cfec79753a98ce1f29bde867
Mar 12 13:51:43 crc kubenswrapper[4921]: I0312 13:51:43.593673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558" event={"ID":"f5b6000a-13f1-4d52-9a03-3b777b3d651d","Type":"ContainerStarted","Data":"856cf405b183ef913bbab194fa6253a325dcfc20cfec79753a98ce1f29bde867"}
Mar 12 13:51:44 crc kubenswrapper[4921]: I0312 13:51:44.617555 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558" event={"ID":"f5b6000a-13f1-4d52-9a03-3b777b3d651d","Type":"ContainerStarted","Data":"3b2a263ccc7e8debfb14da02fae63cb6e231d71eb82b073a1f71304249563d25"}
Mar 12 13:51:44 crc kubenswrapper[4921]: I0312 13:51:44.654504 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558" podStartSLOduration=2.103741945 podStartE2EDuration="2.654477336s" podCreationTimestamp="2026-03-12 13:51:42 +0000 UTC" firstStartedPulling="2026-03-12 13:51:43.535034153 +0000 UTC m=+2526.225106124" lastFinishedPulling="2026-03-12 13:51:44.085769534 +0000 UTC m=+2526.775841515" observedRunningTime="2026-03-12 13:51:44.641624961 +0000 UTC m=+2527.331696962" watchObservedRunningTime="2026-03-12 13:51:44.654477336 +0000 UTC m=+2527.344549347"
Mar 12 13:51:48 crc kubenswrapper[4921]: I0312 13:51:48.984026 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"
Mar 12 13:51:48 crc kubenswrapper[4921]: E0312 13:51:48.984556 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 13:51:49 crc kubenswrapper[4921]: I0312 13:51:49.670793 4921 generic.go:334] "Generic (PLEG): container finished" podID="f5b6000a-13f1-4d52-9a03-3b777b3d651d" containerID="3b2a263ccc7e8debfb14da02fae63cb6e231d71eb82b073a1f71304249563d25" exitCode=0
Mar 12 13:51:49 crc kubenswrapper[4921]: I0312 13:51:49.670925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558" event={"ID":"f5b6000a-13f1-4d52-9a03-3b777b3d651d","Type":"ContainerDied","Data":"3b2a263ccc7e8debfb14da02fae63cb6e231d71eb82b073a1f71304249563d25"}
Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.100257 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558"
Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.259617 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-inventory\") pod \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") "
Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.259708 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ceph\") pod \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") "
Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.259732 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ssh-key-openstack-edpm-ipam\") pod \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") "
Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.259785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b266l\" (UniqueName: \"kubernetes.io/projected/f5b6000a-13f1-4d52-9a03-3b777b3d651d-kube-api-access-b266l\") pod \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\" (UID: \"f5b6000a-13f1-4d52-9a03-3b777b3d651d\") "
Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.264905 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ceph"
(OuterVolumeSpecName: "ceph") pod "f5b6000a-13f1-4d52-9a03-3b777b3d651d" (UID: "f5b6000a-13f1-4d52-9a03-3b777b3d651d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.266627 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b6000a-13f1-4d52-9a03-3b777b3d651d-kube-api-access-b266l" (OuterVolumeSpecName: "kube-api-access-b266l") pod "f5b6000a-13f1-4d52-9a03-3b777b3d651d" (UID: "f5b6000a-13f1-4d52-9a03-3b777b3d651d"). InnerVolumeSpecName "kube-api-access-b266l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.282914 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-inventory" (OuterVolumeSpecName: "inventory") pod "f5b6000a-13f1-4d52-9a03-3b777b3d651d" (UID: "f5b6000a-13f1-4d52-9a03-3b777b3d651d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.289184 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5b6000a-13f1-4d52-9a03-3b777b3d651d" (UID: "f5b6000a-13f1-4d52-9a03-3b777b3d651d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.361760 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.361794 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.361807 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5b6000a-13f1-4d52-9a03-3b777b3d651d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.361820 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b266l\" (UniqueName: \"kubernetes.io/projected/f5b6000a-13f1-4d52-9a03-3b777b3d651d-kube-api-access-b266l\") on node \"crc\" DevicePath \"\"" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.696881 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558" event={"ID":"f5b6000a-13f1-4d52-9a03-3b777b3d651d","Type":"ContainerDied","Data":"856cf405b183ef913bbab194fa6253a325dcfc20cfec79753a98ce1f29bde867"} Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.697196 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856cf405b183ef913bbab194fa6253a325dcfc20cfec79753a98ce1f29bde867" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.696931 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-dt558" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.851326 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb"] Mar 12 13:51:51 crc kubenswrapper[4921]: E0312 13:51:51.851990 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b6000a-13f1-4d52-9a03-3b777b3d651d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.852028 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b6000a-13f1-4d52-9a03-3b777b3d651d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.852373 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b6000a-13f1-4d52-9a03-3b777b3d651d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.853439 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.856327 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.857985 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.858223 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.858483 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.858754 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.860416 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.864316 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb"] Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.971767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.971844 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.971960 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9jc9\" (UniqueName: \"kubernetes.io/projected/8697c3cf-f4d2-45fb-9347-c580192e39d2-kube-api-access-s9jc9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.972014 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.972033 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:51 crc kubenswrapper[4921]: I0312 13:51:51.972067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.073113 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.073387 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.073507 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.073658 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.073767 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.073944 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9jc9\" (UniqueName: \"kubernetes.io/projected/8697c3cf-f4d2-45fb-9347-c580192e39d2-kube-api-access-s9jc9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.075270 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.080395 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.080676 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 
13:51:52.081162 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.081216 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.093539 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9jc9\" (UniqueName: \"kubernetes.io/projected/8697c3cf-f4d2-45fb-9347-c580192e39d2-kube-api-access-s9jc9\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-p2wxb\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.177545 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.691659 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb"] Mar 12 13:51:52 crc kubenswrapper[4921]: I0312 13:51:52.714514 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" event={"ID":"8697c3cf-f4d2-45fb-9347-c580192e39d2","Type":"ContainerStarted","Data":"cfb55e28ed626e174d3cd494ac774ffaf8e00851cdca15fb5969efbd700d001c"} Mar 12 13:51:54 crc kubenswrapper[4921]: I0312 13:51:54.743995 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" event={"ID":"8697c3cf-f4d2-45fb-9347-c580192e39d2","Type":"ContainerStarted","Data":"47257bee4b43a22c48c52e939134d018ed246f3a8c2ef086e8f9a38184c0694c"} Mar 12 13:51:54 crc kubenswrapper[4921]: I0312 13:51:54.766269 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" podStartSLOduration=2.911832548 podStartE2EDuration="3.766245296s" podCreationTimestamp="2026-03-12 13:51:51 +0000 UTC" firstStartedPulling="2026-03-12 13:51:52.700512028 +0000 UTC m=+2535.390583999" lastFinishedPulling="2026-03-12 13:51:53.554924776 +0000 UTC m=+2536.244996747" observedRunningTime="2026-03-12 13:51:54.759935152 +0000 UTC m=+2537.450007143" watchObservedRunningTime="2026-03-12 13:51:54.766245296 +0000 UTC m=+2537.456317277" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.131490 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555392-5bl64"] Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.133324 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.136172 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.136249 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.136405 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.143526 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555392-5bl64"] Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.229609 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcs5j\" (UniqueName: \"kubernetes.io/projected/081f2e1e-8724-4334-8e31-f3b643d5dcc3-kube-api-access-zcs5j\") pod \"auto-csr-approver-29555392-5bl64\" (UID: \"081f2e1e-8724-4334-8e31-f3b643d5dcc3\") " pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.331977 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcs5j\" (UniqueName: \"kubernetes.io/projected/081f2e1e-8724-4334-8e31-f3b643d5dcc3-kube-api-access-zcs5j\") pod \"auto-csr-approver-29555392-5bl64\" (UID: \"081f2e1e-8724-4334-8e31-f3b643d5dcc3\") " pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.350609 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcs5j\" (UniqueName: \"kubernetes.io/projected/081f2e1e-8724-4334-8e31-f3b643d5dcc3-kube-api-access-zcs5j\") pod \"auto-csr-approver-29555392-5bl64\" (UID: \"081f2e1e-8724-4334-8e31-f3b643d5dcc3\") " 
pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.455026 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:00 crc kubenswrapper[4921]: I0312 13:52:00.878980 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555392-5bl64"] Mar 12 13:52:01 crc kubenswrapper[4921]: I0312 13:52:01.797365 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555392-5bl64" event={"ID":"081f2e1e-8724-4334-8e31-f3b643d5dcc3","Type":"ContainerStarted","Data":"a5e9c723c1a4c11d54f27046d3d55944669a8509809937a1551236704730e246"} Mar 12 13:52:02 crc kubenswrapper[4921]: I0312 13:52:02.811784 4921 generic.go:334] "Generic (PLEG): container finished" podID="081f2e1e-8724-4334-8e31-f3b643d5dcc3" containerID="8c06d03bf4c7df3de3245b6931b5cd0701633898a3fa332495e71ac182c3e046" exitCode=0 Mar 12 13:52:02 crc kubenswrapper[4921]: I0312 13:52:02.811875 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555392-5bl64" event={"ID":"081f2e1e-8724-4334-8e31-f3b643d5dcc3","Type":"ContainerDied","Data":"8c06d03bf4c7df3de3245b6931b5cd0701633898a3fa332495e71ac182c3e046"} Mar 12 13:52:03 crc kubenswrapper[4921]: I0312 13:52:03.983192 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:52:03 crc kubenswrapper[4921]: E0312 13:52:03.983703 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" 
Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.123460 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.214486 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcs5j\" (UniqueName: \"kubernetes.io/projected/081f2e1e-8724-4334-8e31-f3b643d5dcc3-kube-api-access-zcs5j\") pod \"081f2e1e-8724-4334-8e31-f3b643d5dcc3\" (UID: \"081f2e1e-8724-4334-8e31-f3b643d5dcc3\") " Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.220502 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/081f2e1e-8724-4334-8e31-f3b643d5dcc3-kube-api-access-zcs5j" (OuterVolumeSpecName: "kube-api-access-zcs5j") pod "081f2e1e-8724-4334-8e31-f3b643d5dcc3" (UID: "081f2e1e-8724-4334-8e31-f3b643d5dcc3"). InnerVolumeSpecName "kube-api-access-zcs5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.317738 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcs5j\" (UniqueName: \"kubernetes.io/projected/081f2e1e-8724-4334-8e31-f3b643d5dcc3-kube-api-access-zcs5j\") on node \"crc\" DevicePath \"\"" Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.827577 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555392-5bl64" event={"ID":"081f2e1e-8724-4334-8e31-f3b643d5dcc3","Type":"ContainerDied","Data":"a5e9c723c1a4c11d54f27046d3d55944669a8509809937a1551236704730e246"} Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.827618 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5e9c723c1a4c11d54f27046d3d55944669a8509809937a1551236704730e246" Mar 12 13:52:04 crc kubenswrapper[4921]: I0312 13:52:04.827649 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555392-5bl64" Mar 12 13:52:05 crc kubenswrapper[4921]: I0312 13:52:05.205964 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555386-fph74"] Mar 12 13:52:05 crc kubenswrapper[4921]: I0312 13:52:05.212903 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555386-fph74"] Mar 12 13:52:05 crc kubenswrapper[4921]: I0312 13:52:05.999490 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d97b3bf-7844-4079-85f8-2e38c3f16346" path="/var/lib/kubelet/pods/4d97b3bf-7844-4079-85f8-2e38c3f16346/volumes" Mar 12 13:52:17 crc kubenswrapper[4921]: I0312 13:52:17.995090 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:52:17 crc kubenswrapper[4921]: E0312 13:52:17.998089 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:52:30 crc kubenswrapper[4921]: I0312 13:52:30.983922 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:52:30 crc kubenswrapper[4921]: E0312 13:52:30.986045 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:52:41 crc kubenswrapper[4921]: I0312 13:52:41.984242 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:52:41 crc kubenswrapper[4921]: E0312 13:52:41.985389 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:52:44 crc kubenswrapper[4921]: I0312 13:52:44.240753 4921 scope.go:117] "RemoveContainer" containerID="2cfaf000d1932ead71adf47335ba9033cdb8f2f9271112224ca5f6c4ecc4b2b0" Mar 12 13:52:53 crc kubenswrapper[4921]: I0312 13:52:53.983918 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:52:53 crc kubenswrapper[4921]: E0312 13:52:53.984510 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:53:03 crc kubenswrapper[4921]: I0312 13:53:03.379039 4921 generic.go:334] "Generic (PLEG): container finished" podID="8697c3cf-f4d2-45fb-9347-c580192e39d2" containerID="47257bee4b43a22c48c52e939134d018ed246f3a8c2ef086e8f9a38184c0694c" exitCode=0 Mar 12 13:53:03 crc kubenswrapper[4921]: I0312 13:53:03.379089 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" event={"ID":"8697c3cf-f4d2-45fb-9347-c580192e39d2","Type":"ContainerDied","Data":"47257bee4b43a22c48c52e939134d018ed246f3a8c2ef086e8f9a38184c0694c"} Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.773843 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.934048 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovncontroller-config-0\") pod \"8697c3cf-f4d2-45fb-9347-c580192e39d2\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.934207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-inventory\") pod \"8697c3cf-f4d2-45fb-9347-c580192e39d2\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.934246 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovn-combined-ca-bundle\") pod \"8697c3cf-f4d2-45fb-9347-c580192e39d2\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.934288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ceph\") pod \"8697c3cf-f4d2-45fb-9347-c580192e39d2\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.934321 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ssh-key-openstack-edpm-ipam\") pod \"8697c3cf-f4d2-45fb-9347-c580192e39d2\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.934353 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9jc9\" (UniqueName: \"kubernetes.io/projected/8697c3cf-f4d2-45fb-9347-c580192e39d2-kube-api-access-s9jc9\") pod \"8697c3cf-f4d2-45fb-9347-c580192e39d2\" (UID: \"8697c3cf-f4d2-45fb-9347-c580192e39d2\") " Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.939372 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8697c3cf-f4d2-45fb-9347-c580192e39d2-kube-api-access-s9jc9" (OuterVolumeSpecName: "kube-api-access-s9jc9") pod "8697c3cf-f4d2-45fb-9347-c580192e39d2" (UID: "8697c3cf-f4d2-45fb-9347-c580192e39d2"). InnerVolumeSpecName "kube-api-access-s9jc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.939731 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8697c3cf-f4d2-45fb-9347-c580192e39d2" (UID: "8697c3cf-f4d2-45fb-9347-c580192e39d2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.940262 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ceph" (OuterVolumeSpecName: "ceph") pod "8697c3cf-f4d2-45fb-9347-c580192e39d2" (UID: "8697c3cf-f4d2-45fb-9347-c580192e39d2"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.959973 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-inventory" (OuterVolumeSpecName: "inventory") pod "8697c3cf-f4d2-45fb-9347-c580192e39d2" (UID: "8697c3cf-f4d2-45fb-9347-c580192e39d2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.960509 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8697c3cf-f4d2-45fb-9347-c580192e39d2" (UID: "8697c3cf-f4d2-45fb-9347-c580192e39d2"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 13:53:04 crc kubenswrapper[4921]: I0312 13:53:04.963994 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8697c3cf-f4d2-45fb-9347-c580192e39d2" (UID: "8697c3cf-f4d2-45fb-9347-c580192e39d2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.036121 4921 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.036161 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.036171 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.036179 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.036188 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8697c3cf-f4d2-45fb-9347-c580192e39d2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.036196 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9jc9\" (UniqueName: \"kubernetes.io/projected/8697c3cf-f4d2-45fb-9347-c580192e39d2-kube-api-access-s9jc9\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.399609 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" 
event={"ID":"8697c3cf-f4d2-45fb-9347-c580192e39d2","Type":"ContainerDied","Data":"cfb55e28ed626e174d3cd494ac774ffaf8e00851cdca15fb5969efbd700d001c"} Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.399974 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb55e28ed626e174d3cd494ac774ffaf8e00851cdca15fb5969efbd700d001c" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.399668 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-p2wxb" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.489363 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2"] Mar 12 13:53:05 crc kubenswrapper[4921]: E0312 13:53:05.489720 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="081f2e1e-8724-4334-8e31-f3b643d5dcc3" containerName="oc" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.489735 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="081f2e1e-8724-4334-8e31-f3b643d5dcc3" containerName="oc" Mar 12 13:53:05 crc kubenswrapper[4921]: E0312 13:53:05.489749 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8697c3cf-f4d2-45fb-9347-c580192e39d2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.489755 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8697c3cf-f4d2-45fb-9347-c580192e39d2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.489951 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8697c3cf-f4d2-45fb-9347-c580192e39d2" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.489970 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="081f2e1e-8724-4334-8e31-f3b643d5dcc3" containerName="oc" 
Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.490656 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.494430 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.494533 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.495480 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.495611 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.495653 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.496463 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.497880 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.502201 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2"] Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646542 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646594 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646633 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlgmj\" (UniqueName: \"kubernetes.io/projected/f5126789-42a1-4b3d-bc96-384b4db790b6-kube-api-access-jlgmj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646664 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646699 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646770 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.646807 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.748746 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.748865 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: 
\"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.749879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlgmj\" (UniqueName: \"kubernetes.io/projected/f5126789-42a1-4b3d-bc96-384b4db790b6-kube-api-access-jlgmj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.749915 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.749960 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.750014 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: 
I0312 13:53:05.750067 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.753266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.753375 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.753910 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.756587 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.758402 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.764050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.768865 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlgmj\" (UniqueName: \"kubernetes.io/projected/f5126789-42a1-4b3d-bc96-384b4db790b6-kube-api-access-jlgmj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.821802 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:53:05 crc kubenswrapper[4921]: I0312 13:53:05.983783 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44" Mar 12 13:53:06 crc kubenswrapper[4921]: I0312 13:53:06.415883 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2"] Mar 12 13:53:06 crc kubenswrapper[4921]: I0312 13:53:06.418335 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"de0aceb5ba9f7cbd7045011859d16969e07d215089a80468c9cc72efa69df4b5"} Mar 12 13:53:06 crc kubenswrapper[4921]: W0312 13:53:06.421263 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5126789_42a1_4b3d_bc96_384b4db790b6.slice/crio-2e6650417b6f850c3cb0b9d9b5fa78245f727417089ff1ee62bca8b96e4e968a WatchSource:0}: Error finding container 2e6650417b6f850c3cb0b9d9b5fa78245f727417089ff1ee62bca8b96e4e968a: Status 404 returned error can't find the container with id 2e6650417b6f850c3cb0b9d9b5fa78245f727417089ff1ee62bca8b96e4e968a Mar 12 13:53:07 crc kubenswrapper[4921]: I0312 13:53:07.428862 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" event={"ID":"f5126789-42a1-4b3d-bc96-384b4db790b6","Type":"ContainerStarted","Data":"ce802b426d20a050c8b5567f7e9ce52f381024b0f6e1565709a61eab0b14543f"} Mar 12 13:53:07 crc kubenswrapper[4921]: I0312 13:53:07.429580 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" 
event={"ID":"f5126789-42a1-4b3d-bc96-384b4db790b6","Type":"ContainerStarted","Data":"2e6650417b6f850c3cb0b9d9b5fa78245f727417089ff1ee62bca8b96e4e968a"} Mar 12 13:53:07 crc kubenswrapper[4921]: I0312 13:53:07.452099 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" podStartSLOduration=1.9998637970000002 podStartE2EDuration="2.452073359s" podCreationTimestamp="2026-03-12 13:53:05 +0000 UTC" firstStartedPulling="2026-03-12 13:53:06.433209829 +0000 UTC m=+2609.123281800" lastFinishedPulling="2026-03-12 13:53:06.885419391 +0000 UTC m=+2609.575491362" observedRunningTime="2026-03-12 13:53:07.447010344 +0000 UTC m=+2610.137082315" watchObservedRunningTime="2026-03-12 13:53:07.452073359 +0000 UTC m=+2610.142145350" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.773187 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xfdhk"] Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.776335 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.782978 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xfdhk"] Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.841244 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-utilities\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.841330 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cvd\" (UniqueName: \"kubernetes.io/projected/7a920e63-d36d-423d-b80d-8731eaf9e1b5-kube-api-access-75cvd\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.841452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-catalog-content\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.942880 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-catalog-content\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.943319 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-catalog-content\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.943457 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-utilities\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.943505 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cvd\" (UniqueName: \"kubernetes.io/projected/7a920e63-d36d-423d-b80d-8731eaf9e1b5-kube-api-access-75cvd\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.943916 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-utilities\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:37 crc kubenswrapper[4921]: I0312 13:53:37.964082 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cvd\" (UniqueName: \"kubernetes.io/projected/7a920e63-d36d-423d-b80d-8731eaf9e1b5-kube-api-access-75cvd\") pod \"community-operators-xfdhk\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:38 crc kubenswrapper[4921]: I0312 13:53:38.094707 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:38 crc kubenswrapper[4921]: W0312 13:53:38.616609 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a920e63_d36d_423d_b80d_8731eaf9e1b5.slice/crio-1940649ed063c67299e6deae6720020470cce1eff0b4a884d092a654b6cbb9a2 WatchSource:0}: Error finding container 1940649ed063c67299e6deae6720020470cce1eff0b4a884d092a654b6cbb9a2: Status 404 returned error can't find the container with id 1940649ed063c67299e6deae6720020470cce1eff0b4a884d092a654b6cbb9a2 Mar 12 13:53:38 crc kubenswrapper[4921]: I0312 13:53:38.621408 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xfdhk"] Mar 12 13:53:38 crc kubenswrapper[4921]: I0312 13:53:38.679196 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfdhk" event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerStarted","Data":"1940649ed063c67299e6deae6720020470cce1eff0b4a884d092a654b6cbb9a2"} Mar 12 13:53:39 crc kubenswrapper[4921]: I0312 13:53:39.689675 4921 generic.go:334] "Generic (PLEG): container finished" podID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerID="449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923" exitCode=0 Mar 12 13:53:39 crc kubenswrapper[4921]: I0312 13:53:39.689759 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfdhk" event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerDied","Data":"449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923"} Mar 12 13:53:40 crc kubenswrapper[4921]: I0312 13:53:40.699389 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfdhk" 
event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerStarted","Data":"9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab"} Mar 12 13:53:42 crc kubenswrapper[4921]: I0312 13:53:42.725288 4921 generic.go:334] "Generic (PLEG): container finished" podID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerID="9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab" exitCode=0 Mar 12 13:53:42 crc kubenswrapper[4921]: I0312 13:53:42.725379 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfdhk" event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerDied","Data":"9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab"} Mar 12 13:53:43 crc kubenswrapper[4921]: I0312 13:53:43.735408 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfdhk" event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerStarted","Data":"636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212"} Mar 12 13:53:43 crc kubenswrapper[4921]: I0312 13:53:43.756136 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xfdhk" podStartSLOduration=3.23457967 podStartE2EDuration="6.756110379s" podCreationTimestamp="2026-03-12 13:53:37 +0000 UTC" firstStartedPulling="2026-03-12 13:53:39.693145513 +0000 UTC m=+2642.383217485" lastFinishedPulling="2026-03-12 13:53:43.214676193 +0000 UTC m=+2645.904748194" observedRunningTime="2026-03-12 13:53:43.752029284 +0000 UTC m=+2646.442101255" watchObservedRunningTime="2026-03-12 13:53:43.756110379 +0000 UTC m=+2646.446182390" Mar 12 13:53:48 crc kubenswrapper[4921]: I0312 13:53:48.095868 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:48 crc kubenswrapper[4921]: I0312 13:53:48.096583 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:48 crc kubenswrapper[4921]: I0312 13:53:48.142508 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:48 crc kubenswrapper[4921]: I0312 13:53:48.838006 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:48 crc kubenswrapper[4921]: I0312 13:53:48.890296 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xfdhk"] Mar 12 13:53:50 crc kubenswrapper[4921]: I0312 13:53:50.793603 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xfdhk" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="registry-server" containerID="cri-o://636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212" gracePeriod=2 Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.281452 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.360633 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-catalog-content\") pod \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.360785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-utilities\") pod \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.360882 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75cvd\" (UniqueName: \"kubernetes.io/projected/7a920e63-d36d-423d-b80d-8731eaf9e1b5-kube-api-access-75cvd\") pod \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\" (UID: \"7a920e63-d36d-423d-b80d-8731eaf9e1b5\") " Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.361510 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-utilities" (OuterVolumeSpecName: "utilities") pod "7a920e63-d36d-423d-b80d-8731eaf9e1b5" (UID: "7a920e63-d36d-423d-b80d-8731eaf9e1b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.371113 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a920e63-d36d-423d-b80d-8731eaf9e1b5-kube-api-access-75cvd" (OuterVolumeSpecName: "kube-api-access-75cvd") pod "7a920e63-d36d-423d-b80d-8731eaf9e1b5" (UID: "7a920e63-d36d-423d-b80d-8731eaf9e1b5"). InnerVolumeSpecName "kube-api-access-75cvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.413310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a920e63-d36d-423d-b80d-8731eaf9e1b5" (UID: "7a920e63-d36d-423d-b80d-8731eaf9e1b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.463864 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.463901 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75cvd\" (UniqueName: \"kubernetes.io/projected/7a920e63-d36d-423d-b80d-8731eaf9e1b5-kube-api-access-75cvd\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.463917 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a920e63-d36d-423d-b80d-8731eaf9e1b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.811522 4921 generic.go:334] "Generic (PLEG): container finished" podID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerID="636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212" exitCode=0 Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.811625 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xfdhk" event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerDied","Data":"636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212"} Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.811663 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-xfdhk" event={"ID":"7a920e63-d36d-423d-b80d-8731eaf9e1b5","Type":"ContainerDied","Data":"1940649ed063c67299e6deae6720020470cce1eff0b4a884d092a654b6cbb9a2"} Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.811694 4921 scope.go:117] "RemoveContainer" containerID="636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.811760 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xfdhk" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.840955 4921 scope.go:117] "RemoveContainer" containerID="9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.860741 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xfdhk"] Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.870475 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xfdhk"] Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.877457 4921 scope.go:117] "RemoveContainer" containerID="449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.904765 4921 scope.go:117] "RemoveContainer" containerID="636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212" Mar 12 13:53:51 crc kubenswrapper[4921]: E0312 13:53:51.905215 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212\": container with ID starting with 636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212 not found: ID does not exist" containerID="636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 
13:53:51.905255 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212"} err="failed to get container status \"636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212\": rpc error: code = NotFound desc = could not find container \"636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212\": container with ID starting with 636998dd8459995a78c02d1be7ea8b5303c28434c5d816e0f39a97977878d212 not found: ID does not exist" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.905280 4921 scope.go:117] "RemoveContainer" containerID="9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab" Mar 12 13:53:51 crc kubenswrapper[4921]: E0312 13:53:51.905689 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab\": container with ID starting with 9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab not found: ID does not exist" containerID="9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.905726 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab"} err="failed to get container status \"9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab\": rpc error: code = NotFound desc = could not find container \"9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab\": container with ID starting with 9740026d801eed2070ea5454ba8db1247c45ff0a93731de76af5050f226bbcab not found: ID does not exist" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.905751 4921 scope.go:117] "RemoveContainer" containerID="449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923" Mar 12 13:53:51 crc 
kubenswrapper[4921]: E0312 13:53:51.906083 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923\": container with ID starting with 449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923 not found: ID does not exist" containerID="449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.906112 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923"} err="failed to get container status \"449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923\": rpc error: code = NotFound desc = could not find container \"449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923\": container with ID starting with 449908901a86eeb8b70294a40d0760a099c4feba59e23aa901c66b2ac86e1923 not found: ID does not exist" Mar 12 13:53:51 crc kubenswrapper[4921]: I0312 13:53:51.993195 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" path="/var/lib/kubelet/pods/7a920e63-d36d-423d-b80d-8731eaf9e1b5/volumes" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.140299 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555394-sn68w"] Mar 12 13:54:00 crc kubenswrapper[4921]: E0312 13:54:00.141125 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="extract-utilities" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.141136 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="extract-utilities" Mar 12 13:54:00 crc kubenswrapper[4921]: E0312 13:54:00.141159 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="extract-content" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.141165 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="extract-content" Mar 12 13:54:00 crc kubenswrapper[4921]: E0312 13:54:00.141186 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="registry-server" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.141193 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="registry-server" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.141358 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a920e63-d36d-423d-b80d-8731eaf9e1b5" containerName="registry-server" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.141874 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.150116 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555394-sn68w"] Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.150559 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.150784 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.154054 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.224288 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvfh\" (UniqueName: 
\"kubernetes.io/projected/04056984-4307-4c27-943d-c6505f8c40c8-kube-api-access-4zvfh\") pod \"auto-csr-approver-29555394-sn68w\" (UID: \"04056984-4307-4c27-943d-c6505f8c40c8\") " pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.325931 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvfh\" (UniqueName: \"kubernetes.io/projected/04056984-4307-4c27-943d-c6505f8c40c8-kube-api-access-4zvfh\") pod \"auto-csr-approver-29555394-sn68w\" (UID: \"04056984-4307-4c27-943d-c6505f8c40c8\") " pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.347188 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvfh\" (UniqueName: \"kubernetes.io/projected/04056984-4307-4c27-943d-c6505f8c40c8-kube-api-access-4zvfh\") pod \"auto-csr-approver-29555394-sn68w\" (UID: \"04056984-4307-4c27-943d-c6505f8c40c8\") " pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.480638 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:00 crc kubenswrapper[4921]: I0312 13:54:00.959635 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555394-sn68w"] Mar 12 13:54:01 crc kubenswrapper[4921]: I0312 13:54:01.909506 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555394-sn68w" event={"ID":"04056984-4307-4c27-943d-c6505f8c40c8","Type":"ContainerStarted","Data":"be52bac47e23ce832a8f226fcdbdae13fd2eaf70fd15c81f4debf2cd52f21eb9"} Mar 12 13:54:02 crc kubenswrapper[4921]: I0312 13:54:02.919750 4921 generic.go:334] "Generic (PLEG): container finished" podID="04056984-4307-4c27-943d-c6505f8c40c8" containerID="8101e4f38dd9f6e2d2e03834a325ab1fd81f4f99667202d8fd6ea978a4391676" exitCode=0 Mar 12 13:54:02 crc kubenswrapper[4921]: I0312 13:54:02.919864 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555394-sn68w" event={"ID":"04056984-4307-4c27-943d-c6505f8c40c8","Type":"ContainerDied","Data":"8101e4f38dd9f6e2d2e03834a325ab1fd81f4f99667202d8fd6ea978a4391676"} Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.255410 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.401691 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvfh\" (UniqueName: \"kubernetes.io/projected/04056984-4307-4c27-943d-c6505f8c40c8-kube-api-access-4zvfh\") pod \"04056984-4307-4c27-943d-c6505f8c40c8\" (UID: \"04056984-4307-4c27-943d-c6505f8c40c8\") " Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.409333 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04056984-4307-4c27-943d-c6505f8c40c8-kube-api-access-4zvfh" (OuterVolumeSpecName: "kube-api-access-4zvfh") pod "04056984-4307-4c27-943d-c6505f8c40c8" (UID: "04056984-4307-4c27-943d-c6505f8c40c8"). InnerVolumeSpecName "kube-api-access-4zvfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.504109 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvfh\" (UniqueName: \"kubernetes.io/projected/04056984-4307-4c27-943d-c6505f8c40c8-kube-api-access-4zvfh\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.937091 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555394-sn68w" event={"ID":"04056984-4307-4c27-943d-c6505f8c40c8","Type":"ContainerDied","Data":"be52bac47e23ce832a8f226fcdbdae13fd2eaf70fd15c81f4debf2cd52f21eb9"} Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.937133 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be52bac47e23ce832a8f226fcdbdae13fd2eaf70fd15c81f4debf2cd52f21eb9" Mar 12 13:54:04 crc kubenswrapper[4921]: I0312 13:54:04.937152 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555394-sn68w" Mar 12 13:54:05 crc kubenswrapper[4921]: I0312 13:54:05.319373 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555388-nhldt"] Mar 12 13:54:05 crc kubenswrapper[4921]: I0312 13:54:05.326243 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555388-nhldt"] Mar 12 13:54:05 crc kubenswrapper[4921]: I0312 13:54:05.995275 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236910c2-bd47-434a-af0c-f599664fda24" path="/var/lib/kubelet/pods/236910c2-bd47-434a-af0c-f599664fda24/volumes" Mar 12 13:54:07 crc kubenswrapper[4921]: I0312 13:54:07.974868 4921 generic.go:334] "Generic (PLEG): container finished" podID="f5126789-42a1-4b3d-bc96-384b4db790b6" containerID="ce802b426d20a050c8b5567f7e9ce52f381024b0f6e1565709a61eab0b14543f" exitCode=0 Mar 12 13:54:07 crc kubenswrapper[4921]: I0312 13:54:07.974955 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" event={"ID":"f5126789-42a1-4b3d-bc96-384b4db790b6","Type":"ContainerDied","Data":"ce802b426d20a050c8b5567f7e9ce52f381024b0f6e1565709a61eab0b14543f"} Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.410292 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.504704 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-inventory\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.504913 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ssh-key-openstack-edpm-ipam\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.504961 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlgmj\" (UniqueName: \"kubernetes.io/projected/f5126789-42a1-4b3d-bc96-384b4db790b6-kube-api-access-jlgmj\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.505006 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-nova-metadata-neutron-config-0\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.505077 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-metadata-combined-ca-bundle\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc 
kubenswrapper[4921]: I0312 13:54:09.505138 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.505202 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ceph\") pod \"f5126789-42a1-4b3d-bc96-384b4db790b6\" (UID: \"f5126789-42a1-4b3d-bc96-384b4db790b6\") " Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.511064 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ceph" (OuterVolumeSpecName: "ceph") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.511110 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5126789-42a1-4b3d-bc96-384b4db790b6-kube-api-access-jlgmj" (OuterVolumeSpecName: "kube-api-access-jlgmj") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). InnerVolumeSpecName "kube-api-access-jlgmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.512321 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). 
InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.531630 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.535602 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.540626 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-inventory" (OuterVolumeSpecName: "inventory") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.545404 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f5126789-42a1-4b3d-bc96-384b4db790b6" (UID: "f5126789-42a1-4b3d-bc96-384b4db790b6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607667 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607722 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607732 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607745 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlgmj\" (UniqueName: \"kubernetes.io/projected/f5126789-42a1-4b3d-bc96-384b4db790b6-kube-api-access-jlgmj\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607753 4921 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607762 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.607772 4921 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/f5126789-42a1-4b3d-bc96-384b4db790b6-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.994529 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.994954 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2" event={"ID":"f5126789-42a1-4b3d-bc96-384b4db790b6","Type":"ContainerDied","Data":"2e6650417b6f850c3cb0b9d9b5fa78245f727417089ff1ee62bca8b96e4e968a"} Mar 12 13:54:09 crc kubenswrapper[4921]: I0312 13:54:09.994990 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e6650417b6f850c3cb0b9d9b5fa78245f727417089ff1ee62bca8b96e4e968a" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.146963 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6"] Mar 12 13:54:10 crc kubenswrapper[4921]: E0312 13:54:10.147316 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5126789-42a1-4b3d-bc96-384b4db790b6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.147328 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5126789-42a1-4b3d-bc96-384b4db790b6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 12 13:54:10 crc kubenswrapper[4921]: E0312 13:54:10.147345 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04056984-4307-4c27-943d-c6505f8c40c8" containerName="oc" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.147351 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="04056984-4307-4c27-943d-c6505f8c40c8" containerName="oc" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 
13:54:10.147504 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="04056984-4307-4c27-943d-c6505f8c40c8" containerName="oc" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.147522 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5126789-42a1-4b3d-bc96-384b4db790b6" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.148042 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.150222 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.150464 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.150612 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.150743 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.153065 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.155229 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.163642 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6"] Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.220765 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cfvv\" 
(UniqueName: \"kubernetes.io/projected/2ee1e205-39b3-4648-8c21-4a7cd46b867f-kube-api-access-4cfvv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.220907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.221050 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.221082 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.221283 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.221370 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.323384 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.323517 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.323578 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.323669 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cfvv\" (UniqueName: 
\"kubernetes.io/projected/2ee1e205-39b3-4648-8c21-4a7cd46b867f-kube-api-access-4cfvv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.323758 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.323903 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.327062 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.327889 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.330025 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.333919 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.338078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.352878 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cfvv\" (UniqueName: \"kubernetes.io/projected/2ee1e205-39b3-4648-8c21-4a7cd46b867f-kube-api-access-4cfvv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:10 crc kubenswrapper[4921]: I0312 13:54:10.467509 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" Mar 12 13:54:11 crc kubenswrapper[4921]: I0312 13:54:11.023090 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6"] Mar 12 13:54:12 crc kubenswrapper[4921]: I0312 13:54:12.017926 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" event={"ID":"2ee1e205-39b3-4648-8c21-4a7cd46b867f","Type":"ContainerStarted","Data":"4df3fac6046801518be7639b9e6f92b216d16acc599da9011491ec4318354bff"} Mar 12 13:54:12 crc kubenswrapper[4921]: I0312 13:54:12.018337 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" event={"ID":"2ee1e205-39b3-4648-8c21-4a7cd46b867f","Type":"ContainerStarted","Data":"f54df5838710c0ad9163fe23fc081494e2b4ae092b623adc93c4ad382b9cc0fb"} Mar 12 13:54:12 crc kubenswrapper[4921]: I0312 13:54:12.056555 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" podStartSLOduration=1.502839525 podStartE2EDuration="2.056537206s" podCreationTimestamp="2026-03-12 13:54:10 +0000 UTC" firstStartedPulling="2026-03-12 13:54:11.02700839 +0000 UTC m=+2673.717080361" lastFinishedPulling="2026-03-12 13:54:11.580706051 +0000 UTC m=+2674.270778042" observedRunningTime="2026-03-12 13:54:12.041161974 +0000 UTC m=+2674.731233935" watchObservedRunningTime="2026-03-12 13:54:12.056537206 +0000 UTC m=+2674.746609177" Mar 12 13:54:44 crc kubenswrapper[4921]: I0312 13:54:44.363901 4921 scope.go:117] "RemoveContainer" containerID="9c27737841a993b5bd568e4afa70a46914aead1836c096c9e3c7edbacedc46d6" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.188885 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d27mh"] Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 
13:55:04.193763 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.202405 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d27mh"] Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.320940 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-utilities\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.320981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-catalog-content\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.321014 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjkdr\" (UniqueName: \"kubernetes.io/projected/ddda98d9-b098-437f-8a37-22560af78cdd-kube-api-access-tjkdr\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.422179 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-utilities\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.422239 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-catalog-content\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.422269 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjkdr\" (UniqueName: \"kubernetes.io/projected/ddda98d9-b098-437f-8a37-22560af78cdd-kube-api-access-tjkdr\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.422673 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-utilities\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.422689 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-catalog-content\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.445787 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjkdr\" (UniqueName: \"kubernetes.io/projected/ddda98d9-b098-437f-8a37-22560af78cdd-kube-api-access-tjkdr\") pod \"redhat-operators-d27mh\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.535521 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:04 crc kubenswrapper[4921]: I0312 13:55:04.979033 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d27mh"] Mar 12 13:55:05 crc kubenswrapper[4921]: I0312 13:55:05.470704 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddda98d9-b098-437f-8a37-22560af78cdd" containerID="b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f" exitCode=0 Mar 12 13:55:05 crc kubenswrapper[4921]: I0312 13:55:05.470884 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerDied","Data":"b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f"} Mar 12 13:55:05 crc kubenswrapper[4921]: I0312 13:55:05.471025 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerStarted","Data":"ecd7d1fdeabe6538bf04b052caefe97e153da4c12f1df351f10b05a4f4c5d3d1"} Mar 12 13:55:05 crc kubenswrapper[4921]: I0312 13:55:05.472716 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:55:07 crc kubenswrapper[4921]: I0312 13:55:07.487504 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerStarted","Data":"6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9"} Mar 12 13:55:09 crc kubenswrapper[4921]: I0312 13:55:09.509986 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddda98d9-b098-437f-8a37-22560af78cdd" containerID="6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9" exitCode=0 Mar 12 13:55:09 crc kubenswrapper[4921]: I0312 13:55:09.510059 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerDied","Data":"6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9"} Mar 12 13:55:10 crc kubenswrapper[4921]: I0312 13:55:10.519143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerStarted","Data":"8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496"} Mar 12 13:55:10 crc kubenswrapper[4921]: I0312 13:55:10.545912 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d27mh" podStartSLOduration=2.080024901 podStartE2EDuration="6.545895476s" podCreationTimestamp="2026-03-12 13:55:04 +0000 UTC" firstStartedPulling="2026-03-12 13:55:05.47251654 +0000 UTC m=+2728.162588511" lastFinishedPulling="2026-03-12 13:55:09.938387105 +0000 UTC m=+2732.628459086" observedRunningTime="2026-03-12 13:55:10.539662515 +0000 UTC m=+2733.229734486" watchObservedRunningTime="2026-03-12 13:55:10.545895476 +0000 UTC m=+2733.235967447" Mar 12 13:55:14 crc kubenswrapper[4921]: I0312 13:55:14.536269 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:14 crc kubenswrapper[4921]: I0312 13:55:14.537030 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:15 crc kubenswrapper[4921]: I0312 13:55:15.603955 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d27mh" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="registry-server" probeResult="failure" output=< Mar 12 13:55:15 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 13:55:15 crc kubenswrapper[4921]: > Mar 12 13:55:24 crc kubenswrapper[4921]: I0312 
13:55:24.596536 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:24 crc kubenswrapper[4921]: I0312 13:55:24.648530 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:24 crc kubenswrapper[4921]: I0312 13:55:24.839386 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d27mh"] Mar 12 13:55:25 crc kubenswrapper[4921]: I0312 13:55:25.673134 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d27mh" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="registry-server" containerID="cri-o://8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496" gracePeriod=2 Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.249762 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.324179 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.324260 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.385589 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-utilities\") pod \"ddda98d9-b098-437f-8a37-22560af78cdd\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.385720 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjkdr\" (UniqueName: \"kubernetes.io/projected/ddda98d9-b098-437f-8a37-22560af78cdd-kube-api-access-tjkdr\") pod \"ddda98d9-b098-437f-8a37-22560af78cdd\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.385785 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-catalog-content\") pod \"ddda98d9-b098-437f-8a37-22560af78cdd\" (UID: \"ddda98d9-b098-437f-8a37-22560af78cdd\") " Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.386759 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-utilities" (OuterVolumeSpecName: "utilities") pod "ddda98d9-b098-437f-8a37-22560af78cdd" (UID: "ddda98d9-b098-437f-8a37-22560af78cdd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.393300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddda98d9-b098-437f-8a37-22560af78cdd-kube-api-access-tjkdr" (OuterVolumeSpecName: "kube-api-access-tjkdr") pod "ddda98d9-b098-437f-8a37-22560af78cdd" (UID: "ddda98d9-b098-437f-8a37-22560af78cdd"). InnerVolumeSpecName "kube-api-access-tjkdr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.487825 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.488062 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjkdr\" (UniqueName: \"kubernetes.io/projected/ddda98d9-b098-437f-8a37-22560af78cdd-kube-api-access-tjkdr\") on node \"crc\" DevicePath \"\"" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.506414 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddda98d9-b098-437f-8a37-22560af78cdd" (UID: "ddda98d9-b098-437f-8a37-22560af78cdd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.590239 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddda98d9-b098-437f-8a37-22560af78cdd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.685079 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddda98d9-b098-437f-8a37-22560af78cdd" containerID="8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496" exitCode=0 Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.685150 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d27mh" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.686325 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerDied","Data":"8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496"} Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.686431 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d27mh" event={"ID":"ddda98d9-b098-437f-8a37-22560af78cdd","Type":"ContainerDied","Data":"ecd7d1fdeabe6538bf04b052caefe97e153da4c12f1df351f10b05a4f4c5d3d1"} Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.686467 4921 scope.go:117] "RemoveContainer" containerID="8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.731173 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d27mh"] Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.737964 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d27mh"] Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.742350 4921 scope.go:117] "RemoveContainer" containerID="6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.774887 4921 scope.go:117] "RemoveContainer" containerID="b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.806886 4921 scope.go:117] "RemoveContainer" containerID="8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496" Mar 12 13:55:26 crc kubenswrapper[4921]: E0312 13:55:26.807415 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496\": container with ID starting with 8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496 not found: ID does not exist" containerID="8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.807474 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496"} err="failed to get container status \"8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496\": rpc error: code = NotFound desc = could not find container \"8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496\": container with ID starting with 8f5118ecb35eff0b44fbafe975df500a4e539fe61430c378d3261cbe49aef496 not found: ID does not exist" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.807506 4921 scope.go:117] "RemoveContainer" containerID="6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9" Mar 12 13:55:26 crc kubenswrapper[4921]: E0312 13:55:26.807989 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9\": container with ID starting with 6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9 not found: ID does not exist" containerID="6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.808028 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9"} err="failed to get container status \"6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9\": rpc error: code = NotFound desc = could not find container \"6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9\": container with ID 
starting with 6af83661747569e9d0dc86b8d05418dbde27531f703a9c524bc4099423bab3e9 not found: ID does not exist" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.808099 4921 scope.go:117] "RemoveContainer" containerID="b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f" Mar 12 13:55:26 crc kubenswrapper[4921]: E0312 13:55:26.809124 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f\": container with ID starting with b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f not found: ID does not exist" containerID="b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f" Mar 12 13:55:26 crc kubenswrapper[4921]: I0312 13:55:26.809163 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f"} err="failed to get container status \"b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f\": rpc error: code = NotFound desc = could not find container \"b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f\": container with ID starting with b37845bf5be648f88b08630d919bfa68cd1cd2b9da1a4a0e519f7e7ad299820f not found: ID does not exist" Mar 12 13:55:27 crc kubenswrapper[4921]: I0312 13:55:27.992303 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" path="/var/lib/kubelet/pods/ddda98d9-b098-437f-8a37-22560af78cdd/volumes" Mar 12 13:55:56 crc kubenswrapper[4921]: I0312 13:55:56.324782 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:55:56 crc kubenswrapper[4921]: I0312 
13:55:56.325407 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.142770 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555396-xbq7x"] Mar 12 13:56:00 crc kubenswrapper[4921]: E0312 13:56:00.143492 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="extract-utilities" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.143508 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="extract-utilities" Mar 12 13:56:00 crc kubenswrapper[4921]: E0312 13:56:00.143533 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="extract-content" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.143539 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="extract-content" Mar 12 13:56:00 crc kubenswrapper[4921]: E0312 13:56:00.143565 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="registry-server" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.143571 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="registry-server" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.143787 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddda98d9-b098-437f-8a37-22560af78cdd" containerName="registry-server" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.144438 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555396-xbq7x" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.149086 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.149476 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.149753 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555396-xbq7x"] Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.151159 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.239932 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllt8\" (UniqueName: \"kubernetes.io/projected/1e8e0d28-8ca4-4de9-aaf1-27d835622e57-kube-api-access-rllt8\") pod \"auto-csr-approver-29555396-xbq7x\" (UID: \"1e8e0d28-8ca4-4de9-aaf1-27d835622e57\") " pod="openshift-infra/auto-csr-approver-29555396-xbq7x" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.341369 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllt8\" (UniqueName: \"kubernetes.io/projected/1e8e0d28-8ca4-4de9-aaf1-27d835622e57-kube-api-access-rllt8\") pod \"auto-csr-approver-29555396-xbq7x\" (UID: \"1e8e0d28-8ca4-4de9-aaf1-27d835622e57\") " pod="openshift-infra/auto-csr-approver-29555396-xbq7x" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.359435 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllt8\" (UniqueName: \"kubernetes.io/projected/1e8e0d28-8ca4-4de9-aaf1-27d835622e57-kube-api-access-rllt8\") pod \"auto-csr-approver-29555396-xbq7x\" (UID: 
\"1e8e0d28-8ca4-4de9-aaf1-27d835622e57\") " pod="openshift-infra/auto-csr-approver-29555396-xbq7x" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.467592 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555396-xbq7x" Mar 12 13:56:00 crc kubenswrapper[4921]: I0312 13:56:00.936515 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555396-xbq7x"] Mar 12 13:56:01 crc kubenswrapper[4921]: I0312 13:56:01.009107 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555396-xbq7x" event={"ID":"1e8e0d28-8ca4-4de9-aaf1-27d835622e57","Type":"ContainerStarted","Data":"837f2738f48f659b72f2419b00febd085815562266651f0a7d17f5f37e67cc60"} Mar 12 13:56:03 crc kubenswrapper[4921]: I0312 13:56:03.032137 4921 generic.go:334] "Generic (PLEG): container finished" podID="1e8e0d28-8ca4-4de9-aaf1-27d835622e57" containerID="195b1aa597b32127bb8951f6be388ab80e01b7d7ad43807926858a5a1bf81feb" exitCode=0 Mar 12 13:56:03 crc kubenswrapper[4921]: I0312 13:56:03.032308 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555396-xbq7x" event={"ID":"1e8e0d28-8ca4-4de9-aaf1-27d835622e57","Type":"ContainerDied","Data":"195b1aa597b32127bb8951f6be388ab80e01b7d7ad43807926858a5a1bf81feb"} Mar 12 13:56:04 crc kubenswrapper[4921]: I0312 13:56:04.403418 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555396-xbq7x" Mar 12 13:56:04 crc kubenswrapper[4921]: I0312 13:56:04.526418 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rllt8\" (UniqueName: \"kubernetes.io/projected/1e8e0d28-8ca4-4de9-aaf1-27d835622e57-kube-api-access-rllt8\") pod \"1e8e0d28-8ca4-4de9-aaf1-27d835622e57\" (UID: \"1e8e0d28-8ca4-4de9-aaf1-27d835622e57\") " Mar 12 13:56:04 crc kubenswrapper[4921]: I0312 13:56:04.531806 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8e0d28-8ca4-4de9-aaf1-27d835622e57-kube-api-access-rllt8" (OuterVolumeSpecName: "kube-api-access-rllt8") pod "1e8e0d28-8ca4-4de9-aaf1-27d835622e57" (UID: "1e8e0d28-8ca4-4de9-aaf1-27d835622e57"). InnerVolumeSpecName "kube-api-access-rllt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 13:56:04 crc kubenswrapper[4921]: I0312 13:56:04.629049 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rllt8\" (UniqueName: \"kubernetes.io/projected/1e8e0d28-8ca4-4de9-aaf1-27d835622e57-kube-api-access-rllt8\") on node \"crc\" DevicePath \"\"" Mar 12 13:56:05 crc kubenswrapper[4921]: I0312 13:56:05.052167 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555396-xbq7x" event={"ID":"1e8e0d28-8ca4-4de9-aaf1-27d835622e57","Type":"ContainerDied","Data":"837f2738f48f659b72f2419b00febd085815562266651f0a7d17f5f37e67cc60"} Mar 12 13:56:05 crc kubenswrapper[4921]: I0312 13:56:05.052214 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="837f2738f48f659b72f2419b00febd085815562266651f0a7d17f5f37e67cc60" Mar 12 13:56:05 crc kubenswrapper[4921]: I0312 13:56:05.052275 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555396-xbq7x"
Mar 12 13:56:05 crc kubenswrapper[4921]: I0312 13:56:05.473756 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555390-sbtp8"]
Mar 12 13:56:05 crc kubenswrapper[4921]: I0312 13:56:05.481934 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555390-sbtp8"]
Mar 12 13:56:06 crc kubenswrapper[4921]: I0312 13:56:06.007705 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12d4f9f-e152-463e-b0d8-c93036e5f85b" path="/var/lib/kubelet/pods/d12d4f9f-e152-463e-b0d8-c93036e5f85b/volumes"
Mar 12 13:56:26 crc kubenswrapper[4921]: I0312 13:56:26.324426 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 13:56:26 crc kubenswrapper[4921]: I0312 13:56:26.325183 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 13:56:26 crc kubenswrapper[4921]: I0312 13:56:26.325251 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq"
Mar 12 13:56:26 crc kubenswrapper[4921]: I0312 13:56:26.326300 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de0aceb5ba9f7cbd7045011859d16969e07d215089a80468c9cc72efa69df4b5"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 13:56:26 crc kubenswrapper[4921]: I0312 13:56:26.326377 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://de0aceb5ba9f7cbd7045011859d16969e07d215089a80468c9cc72efa69df4b5" gracePeriod=600
Mar 12 13:56:27 crc kubenswrapper[4921]: I0312 13:56:27.259066 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="de0aceb5ba9f7cbd7045011859d16969e07d215089a80468c9cc72efa69df4b5" exitCode=0
Mar 12 13:56:27 crc kubenswrapper[4921]: I0312 13:56:27.259123 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"de0aceb5ba9f7cbd7045011859d16969e07d215089a80468c9cc72efa69df4b5"}
Mar 12 13:56:27 crc kubenswrapper[4921]: I0312 13:56:27.260300 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6"}
Mar 12 13:56:27 crc kubenswrapper[4921]: I0312 13:56:27.260500 4921 scope.go:117] "RemoveContainer" containerID="879889561fe1806b0335a90ed5e50159ade78499fd03fdcfe4097d20976adc44"
Mar 12 13:56:44 crc kubenswrapper[4921]: I0312 13:56:44.502995 4921 scope.go:117] "RemoveContainer" containerID="35862c3da763b439a0e8d53f4a00f7b7e1ff9430209db60f461494e4c1f85d94"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.155375 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555398-58zl7"]
Mar 12 13:58:00 crc kubenswrapper[4921]: E0312 13:58:00.156331 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8e0d28-8ca4-4de9-aaf1-27d835622e57" containerName="oc"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.156344 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8e0d28-8ca4-4de9-aaf1-27d835622e57" containerName="oc"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.156514 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8e0d28-8ca4-4de9-aaf1-27d835622e57" containerName="oc"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.157083 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.162275 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.162358 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.162396 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.176508 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555398-58zl7"]
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.337130 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb6vv\" (UniqueName: \"kubernetes.io/projected/2ab05a89-9086-4de8-9e24-03f59f6e2a0b-kube-api-access-jb6vv\") pod \"auto-csr-approver-29555398-58zl7\" (UID: \"2ab05a89-9086-4de8-9e24-03f59f6e2a0b\") " pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.438338 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb6vv\" (UniqueName: \"kubernetes.io/projected/2ab05a89-9086-4de8-9e24-03f59f6e2a0b-kube-api-access-jb6vv\") pod \"auto-csr-approver-29555398-58zl7\" (UID: \"2ab05a89-9086-4de8-9e24-03f59f6e2a0b\") " pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.460742 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb6vv\" (UniqueName: \"kubernetes.io/projected/2ab05a89-9086-4de8-9e24-03f59f6e2a0b-kube-api-access-jb6vv\") pod \"auto-csr-approver-29555398-58zl7\" (UID: \"2ab05a89-9086-4de8-9e24-03f59f6e2a0b\") " pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.482533 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:00 crc kubenswrapper[4921]: I0312 13:58:00.918670 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555398-58zl7"]
Mar 12 13:58:01 crc kubenswrapper[4921]: I0312 13:58:01.140249 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555398-58zl7" event={"ID":"2ab05a89-9086-4de8-9e24-03f59f6e2a0b","Type":"ContainerStarted","Data":"b8ddb646aa49e33aef71a9ed9235e4d3235c06ff4659393bb604fa0baa5ac1ba"}
Mar 12 13:58:03 crc kubenswrapper[4921]: I0312 13:58:03.156867 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ab05a89-9086-4de8-9e24-03f59f6e2a0b" containerID="2e9af1b25a2313f2cd4c9eca01b93c8be5a194ebf02e0099de6ff4c716bc4c9b" exitCode=0
Mar 12 13:58:03 crc kubenswrapper[4921]: I0312 13:58:03.157017 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555398-58zl7" event={"ID":"2ab05a89-9086-4de8-9e24-03f59f6e2a0b","Type":"ContainerDied","Data":"2e9af1b25a2313f2cd4c9eca01b93c8be5a194ebf02e0099de6ff4c716bc4c9b"}
Mar 12 13:58:04 crc kubenswrapper[4921]: I0312 13:58:04.487425 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:04 crc kubenswrapper[4921]: I0312 13:58:04.513470 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb6vv\" (UniqueName: \"kubernetes.io/projected/2ab05a89-9086-4de8-9e24-03f59f6e2a0b-kube-api-access-jb6vv\") pod \"2ab05a89-9086-4de8-9e24-03f59f6e2a0b\" (UID: \"2ab05a89-9086-4de8-9e24-03f59f6e2a0b\") "
Mar 12 13:58:04 crc kubenswrapper[4921]: I0312 13:58:04.520503 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab05a89-9086-4de8-9e24-03f59f6e2a0b-kube-api-access-jb6vv" (OuterVolumeSpecName: "kube-api-access-jb6vv") pod "2ab05a89-9086-4de8-9e24-03f59f6e2a0b" (UID: "2ab05a89-9086-4de8-9e24-03f59f6e2a0b"). InnerVolumeSpecName "kube-api-access-jb6vv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:58:04 crc kubenswrapper[4921]: I0312 13:58:04.615629 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb6vv\" (UniqueName: \"kubernetes.io/projected/2ab05a89-9086-4de8-9e24-03f59f6e2a0b-kube-api-access-jb6vv\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:05 crc kubenswrapper[4921]: I0312 13:58:05.181002 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555398-58zl7" event={"ID":"2ab05a89-9086-4de8-9e24-03f59f6e2a0b","Type":"ContainerDied","Data":"b8ddb646aa49e33aef71a9ed9235e4d3235c06ff4659393bb604fa0baa5ac1ba"}
Mar 12 13:58:05 crc kubenswrapper[4921]: I0312 13:58:05.181066 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8ddb646aa49e33aef71a9ed9235e4d3235c06ff4659393bb604fa0baa5ac1ba"
Mar 12 13:58:05 crc kubenswrapper[4921]: I0312 13:58:05.181078 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555398-58zl7"
Mar 12 13:58:05 crc kubenswrapper[4921]: I0312 13:58:05.573563 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555392-5bl64"]
Mar 12 13:58:05 crc kubenswrapper[4921]: I0312 13:58:05.583021 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555392-5bl64"]
Mar 12 13:58:05 crc kubenswrapper[4921]: I0312 13:58:05.997087 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="081f2e1e-8724-4334-8e31-f3b643d5dcc3" path="/var/lib/kubelet/pods/081f2e1e-8724-4334-8e31-f3b643d5dcc3/volumes"
Mar 12 13:58:18 crc kubenswrapper[4921]: I0312 13:58:18.297286 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ee1e205-39b3-4648-8c21-4a7cd46b867f" containerID="4df3fac6046801518be7639b9e6f92b216d16acc599da9011491ec4318354bff" exitCode=0
Mar 12 13:58:18 crc kubenswrapper[4921]: I0312 13:58:18.297372 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" event={"ID":"2ee1e205-39b3-4648-8c21-4a7cd46b867f","Type":"ContainerDied","Data":"4df3fac6046801518be7639b9e6f92b216d16acc599da9011491ec4318354bff"}
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.768410 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6"
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.794268 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ssh-key-openstack-edpm-ipam\") pod \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") "
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.794350 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cfvv\" (UniqueName: \"kubernetes.io/projected/2ee1e205-39b3-4648-8c21-4a7cd46b867f-kube-api-access-4cfvv\") pod \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") "
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.794386 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ceph\") pod \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") "
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.794401 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-inventory\") pod \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") "
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.794470 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-secret-0\") pod \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") "
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.794491 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-combined-ca-bundle\") pod \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\" (UID: \"2ee1e205-39b3-4648-8c21-4a7cd46b867f\") "
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.807293 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee1e205-39b3-4648-8c21-4a7cd46b867f-kube-api-access-4cfvv" (OuterVolumeSpecName: "kube-api-access-4cfvv") pod "2ee1e205-39b3-4648-8c21-4a7cd46b867f" (UID: "2ee1e205-39b3-4648-8c21-4a7cd46b867f"). InnerVolumeSpecName "kube-api-access-4cfvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.808151 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2ee1e205-39b3-4648-8c21-4a7cd46b867f" (UID: "2ee1e205-39b3-4648-8c21-4a7cd46b867f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.818267 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ceph" (OuterVolumeSpecName: "ceph") pod "2ee1e205-39b3-4648-8c21-4a7cd46b867f" (UID: "2ee1e205-39b3-4648-8c21-4a7cd46b867f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.853921 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-inventory" (OuterVolumeSpecName: "inventory") pod "2ee1e205-39b3-4648-8c21-4a7cd46b867f" (UID: "2ee1e205-39b3-4648-8c21-4a7cd46b867f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.856114 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2ee1e205-39b3-4648-8c21-4a7cd46b867f" (UID: "2ee1e205-39b3-4648-8c21-4a7cd46b867f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.868069 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2ee1e205-39b3-4648-8c21-4a7cd46b867f" (UID: "2ee1e205-39b3-4648-8c21-4a7cd46b867f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.897094 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cfvv\" (UniqueName: \"kubernetes.io/projected/2ee1e205-39b3-4648-8c21-4a7cd46b867f-kube-api-access-4cfvv\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.897483 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ceph\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.897499 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-inventory\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.897514 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.897528 4921 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:19 crc kubenswrapper[4921]: I0312 13:58:19.897545 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ee1e205-39b3-4648-8c21-4a7cd46b867f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.314089 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6" event={"ID":"2ee1e205-39b3-4648-8c21-4a7cd46b867f","Type":"ContainerDied","Data":"f54df5838710c0ad9163fe23fc081494e2b4ae092b623adc93c4ad382b9cc0fb"}
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.314132 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f54df5838710c0ad9163fe23fc081494e2b4ae092b623adc93c4ad382b9cc0fb"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.314150 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.411630 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"]
Mar 12 13:58:20 crc kubenswrapper[4921]: E0312 13:58:20.412347 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab05a89-9086-4de8-9e24-03f59f6e2a0b" containerName="oc"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.412441 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab05a89-9086-4de8-9e24-03f59f6e2a0b" containerName="oc"
Mar 12 13:58:20 crc kubenswrapper[4921]: E0312 13:58:20.412543 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee1e205-39b3-4648-8c21-4a7cd46b867f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.412613 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee1e205-39b3-4648-8c21-4a7cd46b867f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.412966 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee1e205-39b3-4648-8c21-4a7cd46b867f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.413073 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab05a89-9086-4de8-9e24-03f59f6e2a0b" containerName="oc"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.414037 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.417381 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.417748 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420175 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-x7gxf"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420304 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420457 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420509 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420624 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420697 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.420754 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.426722 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"]
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509490 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509578 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509652 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509720 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509757 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509808 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509861 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509889 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509918 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.509993 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n96s\" (UniqueName: \"kubernetes.io/projected/bcef78dc-2d5d-4a04-b106-2b54e1b11292-kube-api-access-9n96s\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.510019 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.510131 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.510176 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.611907 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.611964 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612001 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612036 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612056 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612088 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612104 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612122 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612140 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612168 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n96s\" (UniqueName: \"kubernetes.io/projected/bcef78dc-2d5d-4a04-b106-2b54e1b11292-kube-api-access-9n96s\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612188 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612220 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.612247 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.613649 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.614397 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.615982 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.616004 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.616684 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-2\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.616852 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.617320 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.617534 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.617618 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.617939 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-3\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.619396 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.619587 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.636993 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n96s\" (UniqueName: \"kubernetes.io/projected/bcef78dc-2d5d-4a04-b106-2b54e1b11292-kube-api-access-9n96s\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:20 crc kubenswrapper[4921]: I0312 13:58:20.751706 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"
Mar 12 13:58:21 crc kubenswrapper[4921]: I0312 13:58:21.279937 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j"]
Mar 12 13:58:21 crc kubenswrapper[4921]: I0312 13:58:21.324425 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" event={"ID":"bcef78dc-2d5d-4a04-b106-2b54e1b11292","Type":"ContainerStarted","Data":"7297cfa96c2ce39e76a23204422ea0644c088ce659d821d463941bf53e5cdb0f"}
Mar 12 13:58:22 crc kubenswrapper[4921]: I0312 13:58:22.341369 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" event={"ID":"bcef78dc-2d5d-4a04-b106-2b54e1b11292","Type":"ContainerStarted","Data":"4d706b0711fc22e597011bcbd5393725e8063459be07a5f18987f1d8d6694168"}
Mar 12 13:58:22 crc kubenswrapper[4921]: I0312 13:58:22.371242 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" podStartSLOduration=1.721276047 podStartE2EDuration="2.371224439s" podCreationTimestamp="2026-03-12 13:58:20 +0000 UTC" firstStartedPulling="2026-03-12 13:58:21.281295011 +0000 UTC m=+2923.971366982" lastFinishedPulling="2026-03-12 13:58:21.931243403 +0000 UTC m=+2924.621315374"
observedRunningTime="2026-03-12 13:58:22.363881174 +0000 UTC m=+2925.053953145" watchObservedRunningTime="2026-03-12 13:58:22.371224439 +0000 UTC m=+2925.061296410" Mar 12 13:58:26 crc kubenswrapper[4921]: I0312 13:58:26.326601 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:58:26 crc kubenswrapper[4921]: I0312 13:58:26.331092 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:58:44 crc kubenswrapper[4921]: I0312 13:58:44.592737 4921 scope.go:117] "RemoveContainer" containerID="8c06d03bf4c7df3de3245b6931b5cd0701633898a3fa332495e71ac182c3e046" Mar 12 13:58:56 crc kubenswrapper[4921]: I0312 13:58:56.324476 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:58:56 crc kubenswrapper[4921]: I0312 13:58:56.325203 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.323350 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.323908 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.323955 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.324717 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.324776 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" gracePeriod=600 Mar 12 13:59:26 crc kubenswrapper[4921]: E0312 13:59:26.445399 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.987799 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" exitCode=0 Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.987847 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6"} Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.988177 4921 scope.go:117] "RemoveContainer" containerID="de0aceb5ba9f7cbd7045011859d16969e07d215089a80468c9cc72efa69df4b5" Mar 12 13:59:26 crc kubenswrapper[4921]: I0312 13:59:26.990215 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 13:59:26 crc kubenswrapper[4921]: E0312 13:59:26.991010 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:59:40 crc kubenswrapper[4921]: I0312 13:59:40.983519 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 13:59:40 crc kubenswrapper[4921]: E0312 13:59:40.985199 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 13:59:52 crc kubenswrapper[4921]: I0312 13:59:52.983284 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 13:59:52 crc kubenswrapper[4921]: E0312 13:59:52.984135 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.145519 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555400-nddn9"] Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.149484 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.152048 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.153990 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.155407 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.160623 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555400-nddn9"] Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.169720 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n"] Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.172089 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.174779 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.179681 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n"] Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.179875 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.345965 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4bv7\" (UniqueName: \"kubernetes.io/projected/a9477785-0666-4867-b966-5ea53dd6f0ea-kube-api-access-k4bv7\") pod \"auto-csr-approver-29555400-nddn9\" (UID: \"a9477785-0666-4867-b966-5ea53dd6f0ea\") " pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.346011 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12955294-d435-42e0-9130-5a84882f0fe0-secret-volume\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.346050 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12955294-d435-42e0-9130-5a84882f0fe0-config-volume\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.346088 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g65xm\" (UniqueName: \"kubernetes.io/projected/12955294-d435-42e0-9130-5a84882f0fe0-kube-api-access-g65xm\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.447477 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4bv7\" (UniqueName: \"kubernetes.io/projected/a9477785-0666-4867-b966-5ea53dd6f0ea-kube-api-access-k4bv7\") pod \"auto-csr-approver-29555400-nddn9\" (UID: \"a9477785-0666-4867-b966-5ea53dd6f0ea\") " pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.447530 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12955294-d435-42e0-9130-5a84882f0fe0-secret-volume\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.447587 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12955294-d435-42e0-9130-5a84882f0fe0-config-volume\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.447643 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g65xm\" (UniqueName: 
\"kubernetes.io/projected/12955294-d435-42e0-9130-5a84882f0fe0-kube-api-access-g65xm\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.448612 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12955294-d435-42e0-9130-5a84882f0fe0-config-volume\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.454462 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12955294-d435-42e0-9130-5a84882f0fe0-secret-volume\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.463344 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g65xm\" (UniqueName: \"kubernetes.io/projected/12955294-d435-42e0-9130-5a84882f0fe0-kube-api-access-g65xm\") pod \"collect-profiles-29555400-mjp2n\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.466480 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4bv7\" (UniqueName: \"kubernetes.io/projected/a9477785-0666-4867-b966-5ea53dd6f0ea-kube-api-access-k4bv7\") pod \"auto-csr-approver-29555400-nddn9\" (UID: \"a9477785-0666-4867-b966-5ea53dd6f0ea\") " pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.483573 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.498801 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:00 crc kubenswrapper[4921]: I0312 14:00:00.973450 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555400-nddn9"] Mar 12 14:00:01 crc kubenswrapper[4921]: W0312 14:00:01.071162 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12955294_d435_42e0_9130_5a84882f0fe0.slice/crio-61106179d61de9b78f97545086bb6f31033f6f15f799680b8cd32deffd20f40c WatchSource:0}: Error finding container 61106179d61de9b78f97545086bb6f31033f6f15f799680b8cd32deffd20f40c: Status 404 returned error can't find the container with id 61106179d61de9b78f97545086bb6f31033f6f15f799680b8cd32deffd20f40c Mar 12 14:00:01 crc kubenswrapper[4921]: I0312 14:00:01.072013 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n"] Mar 12 14:00:01 crc kubenswrapper[4921]: I0312 14:00:01.301917 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555400-nddn9" event={"ID":"a9477785-0666-4867-b966-5ea53dd6f0ea","Type":"ContainerStarted","Data":"406698baaadfca483d83239616b911d25b97757e6555bebc064235042ae54f5c"} Mar 12 14:00:01 crc kubenswrapper[4921]: I0312 14:00:01.305239 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" event={"ID":"12955294-d435-42e0-9130-5a84882f0fe0","Type":"ContainerStarted","Data":"f0feebcb84a08e5b580c6e8db79d6242c3e29baca96ede264340ca207c64072f"} Mar 12 14:00:01 crc kubenswrapper[4921]: I0312 14:00:01.305303 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" event={"ID":"12955294-d435-42e0-9130-5a84882f0fe0","Type":"ContainerStarted","Data":"61106179d61de9b78f97545086bb6f31033f6f15f799680b8cd32deffd20f40c"} Mar 12 14:00:01 crc kubenswrapper[4921]: I0312 14:00:01.323232 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" podStartSLOduration=1.323211613 podStartE2EDuration="1.323211613s" podCreationTimestamp="2026-03-12 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:00:01.318732494 +0000 UTC m=+3024.008804475" watchObservedRunningTime="2026-03-12 14:00:01.323211613 +0000 UTC m=+3024.013283604" Mar 12 14:00:02 crc kubenswrapper[4921]: I0312 14:00:02.325932 4921 generic.go:334] "Generic (PLEG): container finished" podID="12955294-d435-42e0-9130-5a84882f0fe0" containerID="f0feebcb84a08e5b580c6e8db79d6242c3e29baca96ede264340ca207c64072f" exitCode=0 Mar 12 14:00:02 crc kubenswrapper[4921]: I0312 14:00:02.325986 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" event={"ID":"12955294-d435-42e0-9130-5a84882f0fe0","Type":"ContainerDied","Data":"f0feebcb84a08e5b580c6e8db79d6242c3e29baca96ede264340ca207c64072f"} Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.706678 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.714983 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12955294-d435-42e0-9130-5a84882f0fe0-secret-volume\") pod \"12955294-d435-42e0-9130-5a84882f0fe0\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.715073 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g65xm\" (UniqueName: \"kubernetes.io/projected/12955294-d435-42e0-9130-5a84882f0fe0-kube-api-access-g65xm\") pod \"12955294-d435-42e0-9130-5a84882f0fe0\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.722490 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12955294-d435-42e0-9130-5a84882f0fe0-kube-api-access-g65xm" (OuterVolumeSpecName: "kube-api-access-g65xm") pod "12955294-d435-42e0-9130-5a84882f0fe0" (UID: "12955294-d435-42e0-9130-5a84882f0fe0"). InnerVolumeSpecName "kube-api-access-g65xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.726952 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12955294-d435-42e0-9130-5a84882f0fe0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "12955294-d435-42e0-9130-5a84882f0fe0" (UID: "12955294-d435-42e0-9130-5a84882f0fe0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.817248 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12955294-d435-42e0-9130-5a84882f0fe0-config-volume\") pod \"12955294-d435-42e0-9130-5a84882f0fe0\" (UID: \"12955294-d435-42e0-9130-5a84882f0fe0\") " Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.817728 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12955294-d435-42e0-9130-5a84882f0fe0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.817755 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g65xm\" (UniqueName: \"kubernetes.io/projected/12955294-d435-42e0-9130-5a84882f0fe0-kube-api-access-g65xm\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.818076 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12955294-d435-42e0-9130-5a84882f0fe0-config-volume" (OuterVolumeSpecName: "config-volume") pod "12955294-d435-42e0-9130-5a84882f0fe0" (UID: "12955294-d435-42e0-9130-5a84882f0fe0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.919669 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12955294-d435-42e0-9130-5a84882f0fe0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:03 crc kubenswrapper[4921]: I0312 14:00:03.983105 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:00:03 crc kubenswrapper[4921]: E0312 14:00:03.983349 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:00:04 crc kubenswrapper[4921]: I0312 14:00:04.353559 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" Mar 12 14:00:04 crc kubenswrapper[4921]: I0312 14:00:04.353610 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n" event={"ID":"12955294-d435-42e0-9130-5a84882f0fe0","Type":"ContainerDied","Data":"61106179d61de9b78f97545086bb6f31033f6f15f799680b8cd32deffd20f40c"} Mar 12 14:00:04 crc kubenswrapper[4921]: I0312 14:00:04.354263 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61106179d61de9b78f97545086bb6f31033f6f15f799680b8cd32deffd20f40c" Mar 12 14:00:04 crc kubenswrapper[4921]: I0312 14:00:04.404644 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6"] Mar 12 14:00:04 crc kubenswrapper[4921]: I0312 14:00:04.413954 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555355-8psk6"] Mar 12 14:00:05 crc kubenswrapper[4921]: I0312 14:00:05.382733 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9477785-0666-4867-b966-5ea53dd6f0ea" containerID="bab3318f7e4bfa10390f12f5745b7634213c91719deb18176f1d832b464ed9d4" exitCode=0 Mar 12 14:00:05 crc kubenswrapper[4921]: I0312 14:00:05.382787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555400-nddn9" event={"ID":"a9477785-0666-4867-b966-5ea53dd6f0ea","Type":"ContainerDied","Data":"bab3318f7e4bfa10390f12f5745b7634213c91719deb18176f1d832b464ed9d4"} Mar 12 14:00:05 crc kubenswrapper[4921]: I0312 14:00:05.998588 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca8df82-33c8-43ef-8e87-9df25af27923" path="/var/lib/kubelet/pods/0ca8df82-33c8-43ef-8e87-9df25af27923/volumes" Mar 12 14:00:06 crc kubenswrapper[4921]: I0312 14:00:06.773291 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:06 crc kubenswrapper[4921]: I0312 14:00:06.781471 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4bv7\" (UniqueName: \"kubernetes.io/projected/a9477785-0666-4867-b966-5ea53dd6f0ea-kube-api-access-k4bv7\") pod \"a9477785-0666-4867-b966-5ea53dd6f0ea\" (UID: \"a9477785-0666-4867-b966-5ea53dd6f0ea\") " Mar 12 14:00:06 crc kubenswrapper[4921]: I0312 14:00:06.788168 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9477785-0666-4867-b966-5ea53dd6f0ea-kube-api-access-k4bv7" (OuterVolumeSpecName: "kube-api-access-k4bv7") pod "a9477785-0666-4867-b966-5ea53dd6f0ea" (UID: "a9477785-0666-4867-b966-5ea53dd6f0ea"). InnerVolumeSpecName "kube-api-access-k4bv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:00:06 crc kubenswrapper[4921]: I0312 14:00:06.883030 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4bv7\" (UniqueName: \"kubernetes.io/projected/a9477785-0666-4867-b966-5ea53dd6f0ea-kube-api-access-k4bv7\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:07 crc kubenswrapper[4921]: I0312 14:00:07.403831 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555400-nddn9" event={"ID":"a9477785-0666-4867-b966-5ea53dd6f0ea","Type":"ContainerDied","Data":"406698baaadfca483d83239616b911d25b97757e6555bebc064235042ae54f5c"} Mar 12 14:00:07 crc kubenswrapper[4921]: I0312 14:00:07.403878 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="406698baaadfca483d83239616b911d25b97757e6555bebc064235042ae54f5c" Mar 12 14:00:07 crc kubenswrapper[4921]: I0312 14:00:07.403923 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555400-nddn9" Mar 12 14:00:07 crc kubenswrapper[4921]: E0312 14:00:07.564310 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9477785_0666_4867_b966_5ea53dd6f0ea.slice/crio-406698baaadfca483d83239616b911d25b97757e6555bebc064235042ae54f5c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9477785_0666_4867_b966_5ea53dd6f0ea.slice\": RecentStats: unable to find data in memory cache]" Mar 12 14:00:07 crc kubenswrapper[4921]: I0312 14:00:07.830012 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555394-sn68w"] Mar 12 14:00:07 crc kubenswrapper[4921]: I0312 14:00:07.844968 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555394-sn68w"] Mar 12 14:00:08 crc kubenswrapper[4921]: I0312 14:00:08.010056 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04056984-4307-4c27-943d-c6505f8c40c8" path="/var/lib/kubelet/pods/04056984-4307-4c27-943d-c6505f8c40c8/volumes" Mar 12 14:00:16 crc kubenswrapper[4921]: I0312 14:00:16.982916 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:00:16 crc kubenswrapper[4921]: E0312 14:00:16.983863 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:00:27 crc kubenswrapper[4921]: I0312 14:00:27.999133 4921 
scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:00:28 crc kubenswrapper[4921]: E0312 14:00:28.000001 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.089088 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9th9m"] Mar 12 14:00:33 crc kubenswrapper[4921]: E0312 14:00:33.090213 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9477785-0666-4867-b966-5ea53dd6f0ea" containerName="oc" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.090243 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9477785-0666-4867-b966-5ea53dd6f0ea" containerName="oc" Mar 12 14:00:33 crc kubenswrapper[4921]: E0312 14:00:33.090324 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12955294-d435-42e0-9130-5a84882f0fe0" containerName="collect-profiles" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.090341 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="12955294-d435-42e0-9130-5a84882f0fe0" containerName="collect-profiles" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.090667 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="12955294-d435-42e0-9130-5a84882f0fe0" containerName="collect-profiles" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.090718 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9477785-0666-4867-b966-5ea53dd6f0ea" containerName="oc" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.095924 
4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.105202 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9th9m"] Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.190006 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xl2\" (UniqueName: \"kubernetes.io/projected/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-kube-api-access-n9xl2\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.190089 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-catalog-content\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.190429 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-utilities\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.292034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-catalog-content\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 
14:00:33.292163 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-utilities\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.292213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xl2\" (UniqueName: \"kubernetes.io/projected/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-kube-api-access-n9xl2\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.292575 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-catalog-content\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.292698 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-utilities\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.312452 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xl2\" (UniqueName: \"kubernetes.io/projected/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-kube-api-access-n9xl2\") pod \"redhat-marketplace-9th9m\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.429320 4921 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:33 crc kubenswrapper[4921]: I0312 14:00:33.905097 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9th9m"] Mar 12 14:00:34 crc kubenswrapper[4921]: I0312 14:00:34.682979 4921 generic.go:334] "Generic (PLEG): container finished" podID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerID="6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069" exitCode=0 Mar 12 14:00:34 crc kubenswrapper[4921]: I0312 14:00:34.683046 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerDied","Data":"6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069"} Mar 12 14:00:34 crc kubenswrapper[4921]: I0312 14:00:34.683326 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerStarted","Data":"6d02db34664ba92443bc7a9aa7603e2d34dc04e1cefafe9e531b25381cba0af8"} Mar 12 14:00:34 crc kubenswrapper[4921]: I0312 14:00:34.689316 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:00:35 crc kubenswrapper[4921]: I0312 14:00:35.693776 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerStarted","Data":"2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8"} Mar 12 14:00:36 crc kubenswrapper[4921]: I0312 14:00:36.706838 4921 generic.go:334] "Generic (PLEG): container finished" podID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerID="2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8" exitCode=0 Mar 12 14:00:36 crc kubenswrapper[4921]: I0312 14:00:36.707014 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerDied","Data":"2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8"} Mar 12 14:00:37 crc kubenswrapper[4921]: I0312 14:00:37.718482 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerStarted","Data":"c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807"} Mar 12 14:00:37 crc kubenswrapper[4921]: I0312 14:00:37.740974 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9th9m" podStartSLOduration=2.28755903 podStartE2EDuration="4.740952568s" podCreationTimestamp="2026-03-12 14:00:33 +0000 UTC" firstStartedPulling="2026-03-12 14:00:34.6889431 +0000 UTC m=+3057.379015111" lastFinishedPulling="2026-03-12 14:00:37.142336678 +0000 UTC m=+3059.832408649" observedRunningTime="2026-03-12 14:00:37.737717648 +0000 UTC m=+3060.427789619" watchObservedRunningTime="2026-03-12 14:00:37.740952568 +0000 UTC m=+3060.431024549" Mar 12 14:00:38 crc kubenswrapper[4921]: I0312 14:00:38.983808 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:00:38 crc kubenswrapper[4921]: E0312 14:00:38.984411 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:00:43 crc kubenswrapper[4921]: I0312 14:00:43.430089 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:43 crc kubenswrapper[4921]: I0312 14:00:43.430696 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:43 crc kubenswrapper[4921]: I0312 14:00:43.488537 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:43 crc kubenswrapper[4921]: I0312 14:00:43.829033 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:43 crc kubenswrapper[4921]: I0312 14:00:43.885235 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9th9m"] Mar 12 14:00:44 crc kubenswrapper[4921]: I0312 14:00:44.692241 4921 scope.go:117] "RemoveContainer" containerID="642a58f7ddbf06f95ca332f4a68933c68769ea95f0d709b937fdfc24450ad2d5" Mar 12 14:00:44 crc kubenswrapper[4921]: I0312 14:00:44.727851 4921 scope.go:117] "RemoveContainer" containerID="8101e4f38dd9f6e2d2e03834a325ab1fd81f4f99667202d8fd6ea978a4391676" Mar 12 14:00:45 crc kubenswrapper[4921]: I0312 14:00:45.789297 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9th9m" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="registry-server" containerID="cri-o://c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807" gracePeriod=2 Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.702802 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.799560 4921 generic.go:334] "Generic (PLEG): container finished" podID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerID="c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807" exitCode=0 Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.799607 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerDied","Data":"c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807"} Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.799644 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9th9m" event={"ID":"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa","Type":"ContainerDied","Data":"6d02db34664ba92443bc7a9aa7603e2d34dc04e1cefafe9e531b25381cba0af8"} Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.799666 4921 scope.go:117] "RemoveContainer" containerID="c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.799676 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9th9m" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.823717 4921 scope.go:117] "RemoveContainer" containerID="2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.841252 4921 scope.go:117] "RemoveContainer" containerID="6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.875588 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9xl2\" (UniqueName: \"kubernetes.io/projected/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-kube-api-access-n9xl2\") pod \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.875711 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-catalog-content\") pod \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.875942 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-utilities\") pod \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\" (UID: \"63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa\") " Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.877042 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-utilities" (OuterVolumeSpecName: "utilities") pod "63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" (UID: "63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.877239 4921 scope.go:117] "RemoveContainer" containerID="c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807" Mar 12 14:00:46 crc kubenswrapper[4921]: E0312 14:00:46.877787 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807\": container with ID starting with c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807 not found: ID does not exist" containerID="c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.877843 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807"} err="failed to get container status \"c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807\": rpc error: code = NotFound desc = could not find container \"c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807\": container with ID starting with c73c9ca33e9064689872c6e4374bc1ab7c132a240c6c183b19c7941395201807 not found: ID does not exist" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.877867 4921 scope.go:117] "RemoveContainer" containerID="2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8" Mar 12 14:00:46 crc kubenswrapper[4921]: E0312 14:00:46.878159 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8\": container with ID starting with 2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8 not found: ID does not exist" containerID="2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.878182 
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8"} err="failed to get container status \"2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8\": rpc error: code = NotFound desc = could not find container \"2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8\": container with ID starting with 2d099d94cb7f16497950150901a9ff6a8e5795a692db24166ab065bfdc4c5da8 not found: ID does not exist" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.878195 4921 scope.go:117] "RemoveContainer" containerID="6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069" Mar 12 14:00:46 crc kubenswrapper[4921]: E0312 14:00:46.878642 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069\": container with ID starting with 6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069 not found: ID does not exist" containerID="6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.878674 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069"} err="failed to get container status \"6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069\": rpc error: code = NotFound desc = could not find container \"6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069\": container with ID starting with 6ba1bfbc648e26280cad12151f6d48f4d20655f7274631b2ab94099138fe3069 not found: ID does not exist" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.882172 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-kube-api-access-n9xl2" 
(OuterVolumeSpecName: "kube-api-access-n9xl2") pod "63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" (UID: "63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa"). InnerVolumeSpecName "kube-api-access-n9xl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.901785 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" (UID: "63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.978186 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.978217 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9xl2\" (UniqueName: \"kubernetes.io/projected/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-kube-api-access-n9xl2\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:46 crc kubenswrapper[4921]: I0312 14:00:46.978228 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:00:47 crc kubenswrapper[4921]: I0312 14:00:47.139985 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9th9m"] Mar 12 14:00:47 crc kubenswrapper[4921]: I0312 14:00:47.148572 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9th9m"] Mar 12 14:00:48 crc kubenswrapper[4921]: I0312 14:00:48.017604 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" path="/var/lib/kubelet/pods/63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa/volumes" Mar 12 14:00:50 crc kubenswrapper[4921]: I0312 14:00:50.982997 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:00:50 crc kubenswrapper[4921]: E0312 14:00:50.983664 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.153850 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29555401-cfhz9"] Mar 12 14:01:00 crc kubenswrapper[4921]: E0312 14:01:00.154787 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="extract-utilities" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.154804 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="extract-utilities" Mar 12 14:01:00 crc kubenswrapper[4921]: E0312 14:01:00.154835 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="extract-content" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.154843 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="extract-content" Mar 12 14:01:00 crc kubenswrapper[4921]: E0312 14:01:00.154857 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="registry-server" Mar 12 14:01:00 crc kubenswrapper[4921]: 
I0312 14:01:00.154864 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="registry-server" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.155073 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dbdb6c-d563-4e13-bc0b-2ad22b7a8bfa" containerName="registry-server" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.155912 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.183238 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29555401-cfhz9"] Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.257198 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-fernet-keys\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.257486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-config-data\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.258267 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsr6\" (UniqueName: \"kubernetes.io/projected/c85b992e-689f-4f2f-9799-da7e608f6ca8-kube-api-access-mpsr6\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.258581 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-combined-ca-bundle\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.360421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-fernet-keys\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.360469 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-config-data\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.360507 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsr6\" (UniqueName: \"kubernetes.io/projected/c85b992e-689f-4f2f-9799-da7e608f6ca8-kube-api-access-mpsr6\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.360593 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-combined-ca-bundle\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.369741 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-fernet-keys\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.384185 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-combined-ca-bundle\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.384501 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-config-data\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.391344 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpsr6\" (UniqueName: \"kubernetes.io/projected/c85b992e-689f-4f2f-9799-da7e608f6ca8-kube-api-access-mpsr6\") pod \"keystone-cron-29555401-cfhz9\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.483735 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:00 crc kubenswrapper[4921]: I0312 14:01:00.971442 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29555401-cfhz9"] Mar 12 14:01:01 crc kubenswrapper[4921]: I0312 14:01:01.943474 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555401-cfhz9" event={"ID":"c85b992e-689f-4f2f-9799-da7e608f6ca8","Type":"ContainerStarted","Data":"62cdad4e2ea359b76c2d2e2899c04981fcb6840355012c10de7eff21c109876d"} Mar 12 14:01:01 crc kubenswrapper[4921]: I0312 14:01:01.943858 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555401-cfhz9" event={"ID":"c85b992e-689f-4f2f-9799-da7e608f6ca8","Type":"ContainerStarted","Data":"562df4da9b2c820f3a20737c57582374bddbf67a06144c89ed25bfd91a332e64"} Mar 12 14:01:01 crc kubenswrapper[4921]: I0312 14:01:01.973779 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29555401-cfhz9" podStartSLOduration=1.9737522269999999 podStartE2EDuration="1.973752227s" podCreationTimestamp="2026-03-12 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:01.961162309 +0000 UTC m=+3084.651234280" watchObservedRunningTime="2026-03-12 14:01:01.973752227 +0000 UTC m=+3084.663824198" Mar 12 14:01:03 crc kubenswrapper[4921]: I0312 14:01:03.961478 4921 generic.go:334] "Generic (PLEG): container finished" podID="c85b992e-689f-4f2f-9799-da7e608f6ca8" containerID="62cdad4e2ea359b76c2d2e2899c04981fcb6840355012c10de7eff21c109876d" exitCode=0 Mar 12 14:01:03 crc kubenswrapper[4921]: I0312 14:01:03.961598 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555401-cfhz9" 
event={"ID":"c85b992e-689f-4f2f-9799-da7e608f6ca8","Type":"ContainerDied","Data":"62cdad4e2ea359b76c2d2e2899c04981fcb6840355012c10de7eff21c109876d"} Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.321731 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.482848 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-combined-ca-bundle\") pod \"c85b992e-689f-4f2f-9799-da7e608f6ca8\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.482981 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpsr6\" (UniqueName: \"kubernetes.io/projected/c85b992e-689f-4f2f-9799-da7e608f6ca8-kube-api-access-mpsr6\") pod \"c85b992e-689f-4f2f-9799-da7e608f6ca8\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.483040 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-config-data\") pod \"c85b992e-689f-4f2f-9799-da7e608f6ca8\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.483118 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-fernet-keys\") pod \"c85b992e-689f-4f2f-9799-da7e608f6ca8\" (UID: \"c85b992e-689f-4f2f-9799-da7e608f6ca8\") " Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.488404 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85b992e-689f-4f2f-9799-da7e608f6ca8-kube-api-access-mpsr6" 
(OuterVolumeSpecName: "kube-api-access-mpsr6") pod "c85b992e-689f-4f2f-9799-da7e608f6ca8" (UID: "c85b992e-689f-4f2f-9799-da7e608f6ca8"). InnerVolumeSpecName "kube-api-access-mpsr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.488481 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c85b992e-689f-4f2f-9799-da7e608f6ca8" (UID: "c85b992e-689f-4f2f-9799-da7e608f6ca8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.512057 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c85b992e-689f-4f2f-9799-da7e608f6ca8" (UID: "c85b992e-689f-4f2f-9799-da7e608f6ca8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.527936 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-config-data" (OuterVolumeSpecName: "config-data") pod "c85b992e-689f-4f2f-9799-da7e608f6ca8" (UID: "c85b992e-689f-4f2f-9799-da7e608f6ca8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.585634 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.585686 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.585705 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c85b992e-689f-4f2f-9799-da7e608f6ca8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.585725 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpsr6\" (UniqueName: \"kubernetes.io/projected/c85b992e-689f-4f2f-9799-da7e608f6ca8-kube-api-access-mpsr6\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.983099 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:01:05 crc kubenswrapper[4921]: E0312 14:01:05.983694 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.987805 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555401-cfhz9" Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.996960 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555401-cfhz9" event={"ID":"c85b992e-689f-4f2f-9799-da7e608f6ca8","Type":"ContainerDied","Data":"562df4da9b2c820f3a20737c57582374bddbf67a06144c89ed25bfd91a332e64"} Mar 12 14:01:05 crc kubenswrapper[4921]: I0312 14:01:05.997200 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="562df4da9b2c820f3a20737c57582374bddbf67a06144c89ed25bfd91a332e64" Mar 12 14:01:15 crc kubenswrapper[4921]: I0312 14:01:15.087081 4921 generic.go:334] "Generic (PLEG): container finished" podID="bcef78dc-2d5d-4a04-b106-2b54e1b11292" containerID="4d706b0711fc22e597011bcbd5393725e8063459be07a5f18987f1d8d6694168" exitCode=0 Mar 12 14:01:15 crc kubenswrapper[4921]: I0312 14:01:15.087179 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" event={"ID":"bcef78dc-2d5d-4a04-b106-2b54e1b11292","Type":"ContainerDied","Data":"4d706b0711fc22e597011bcbd5393725e8063459be07a5f18987f1d8d6694168"} Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.512183 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593288 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n96s\" (UniqueName: \"kubernetes.io/projected/bcef78dc-2d5d-4a04-b106-2b54e1b11292-kube-api-access-9n96s\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593368 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-1\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593424 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-custom-ceph-combined-ca-bundle\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593555 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ssh-key-openstack-edpm-ipam\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593619 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-0\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 
14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593691 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-inventory\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593723 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-1\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593756 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-3\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593779 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593828 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-0\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593898 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph-nova-0\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.593960 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-extra-config-0\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.594021 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-2\") pod \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\" (UID: \"bcef78dc-2d5d-4a04-b106-2b54e1b11292\") " Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.602868 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.604406 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph" (OuterVolumeSpecName: "ceph") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.604913 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcef78dc-2d5d-4a04-b106-2b54e1b11292-kube-api-access-9n96s" (OuterVolumeSpecName: "kube-api-access-9n96s") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "kube-api-access-9n96s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.624769 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.628264 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.637688 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.638357 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.642059 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.646041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.650977 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.651000 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.653510 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.655281 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-inventory" (OuterVolumeSpecName: "inventory") pod "bcef78dc-2d5d-4a04-b106-2b54e1b11292" (UID: "bcef78dc-2d5d-4a04-b106-2b54e1b11292"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.697507 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n96s\" (UniqueName: \"kubernetes.io/projected/bcef78dc-2d5d-4a04-b106-2b54e1b11292-kube-api-access-9n96s\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.697738 4921 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.697883 4921 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698028 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698139 4921 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698248 4921 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-inventory\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698364 4921 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698479 4921 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698694 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.698962 4921 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.699190 4921 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.699307 4921 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:16 crc kubenswrapper[4921]: I0312 14:01:16.699409 4921 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/bcef78dc-2d5d-4a04-b106-2b54e1b11292-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:17 crc kubenswrapper[4921]: I0312 14:01:17.108525 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" 
event={"ID":"bcef78dc-2d5d-4a04-b106-2b54e1b11292","Type":"ContainerDied","Data":"7297cfa96c2ce39e76a23204422ea0644c088ce659d821d463941bf53e5cdb0f"} Mar 12 14:01:17 crc kubenswrapper[4921]: I0312 14:01:17.108572 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7297cfa96c2ce39e76a23204422ea0644c088ce659d821d463941bf53e5cdb0f" Mar 12 14:01:17 crc kubenswrapper[4921]: I0312 14:01:17.108572 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j" Mar 12 14:01:20 crc kubenswrapper[4921]: I0312 14:01:20.983672 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:01:20 crc kubenswrapper[4921]: E0312 14:01:20.984572 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.493038 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c8b44c5c7-l6d8m"] Mar 12 14:01:34 crc kubenswrapper[4921]: E0312 14:01:34.493903 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85b992e-689f-4f2f-9799-da7e608f6ca8" containerName="keystone-cron" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.493916 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85b992e-689f-4f2f-9799-da7e608f6ca8" containerName="keystone-cron" Mar 12 14:01:34 crc kubenswrapper[4921]: E0312 14:01:34.493931 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcef78dc-2d5d-4a04-b106-2b54e1b11292" 
containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.493938 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcef78dc-2d5d-4a04-b106-2b54e1b11292" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.494088 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85b992e-689f-4f2f-9799-da7e608f6ca8" containerName="keystone-cron" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.494115 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcef78dc-2d5d-4a04-b106-2b54e1b11292" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.494678 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.508158 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c8b44c5c7-l6d8m"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.652586 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-fernet-keys\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.652681 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-config-data\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.652707 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-public-tls-certs\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.652908 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-scripts\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.652963 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-internal-tls-certs\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.653172 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-credential-keys\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.653247 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-combined-ca-bundle\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.653332 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhwl\" (UniqueName: \"kubernetes.io/projected/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-kube-api-access-kqhwl\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.737643 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-1"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.739595 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754335 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-scripts\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754375 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-internal-tls-certs\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754446 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-credential-keys\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754465 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-combined-ca-bundle\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754503 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhwl\" (UniqueName: \"kubernetes.io/projected/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-kube-api-access-kqhwl\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754535 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-fernet-keys\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754590 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-config-data\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.754614 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-public-tls-certs\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.758871 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-1"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.762846 4921 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-internal-tls-certs\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.763112 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-scripts\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.763746 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-config-data\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.764390 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-public-tls-certs\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.766348 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-credential-keys\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.769270 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-fernet-keys\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: 
\"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.771983 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-combined-ca-bundle\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.795488 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqhwl\" (UniqueName: \"kubernetes.io/projected/8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118-kube-api-access-kqhwl\") pod \"keystone-c8b44c5c7-l6d8m\" (UID: \"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118\") " pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.813483 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.849528 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.851030 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.853785 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.854009 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.857616 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-internal-tls-certs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.857744 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-config-data\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.857845 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-public-tls-certs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.857926 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-combined-ca-bundle\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.858017 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5xwr\" (UniqueName: \"kubernetes.io/projected/b1c64c98-e301-4386-b33e-ccd4fde7592d-kube-api-access-b5xwr\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.858100 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-scripts\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.858231 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1c64c98-e301-4386-b33e-ccd4fde7592d-logs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.858329 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-config-data-custom\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.858417 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1c64c98-e301-4386-b33e-ccd4fde7592d-etc-machine-id\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.863333 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 
14:01:34.897856 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.899563 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.901559 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.948777 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.959940 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960098 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5xwr\" (UniqueName: \"kubernetes.io/projected/b1c64c98-e301-4386-b33e-ccd4fde7592d-kube-api-access-b5xwr\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960163 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960190 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-scripts\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960247 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960272 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960287 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8671593e-1709-4d99-ae81-8639ee492d20-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960325 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-sys\") pod 
\"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960342 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960364 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960400 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1c64c98-e301-4386-b33e-ccd4fde7592d-logs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960420 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960441 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-run\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " 
pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960601 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-config-data-custom\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960648 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1c64c98-e301-4386-b33e-ccd4fde7592d-etc-machine-id\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960739 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-internal-tls-certs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960878 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-config-data\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.960905 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-public-tls-certs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.961308 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b1c64c98-e301-4386-b33e-ccd4fde7592d-logs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.961937 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1c64c98-e301-4386-b33e-ccd4fde7592d-etc-machine-id\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.964787 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.965268 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-combined-ca-bundle\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " 
pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.965319 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mbm\" (UniqueName: \"kubernetes.io/projected/8671593e-1709-4d99-ae81-8639ee492d20-kube-api-access-v7mbm\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.965358 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.967366 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-scripts\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.967836 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-public-tls-certs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.968439 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-internal-tls-certs\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.970546 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-config-data\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:34 crc kubenswrapper[4921]: I0312 14:01:34.972495 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-config-data-custom\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.000553 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5xwr\" (UniqueName: \"kubernetes.io/projected/b1c64c98-e301-4386-b33e-ccd4fde7592d-kube-api-access-b5xwr\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.003926 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77dd7dfdbc-tsrfv"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.005779 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.017899 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1c64c98-e301-4386-b33e-ccd4fde7592d-combined-ca-bundle\") pod \"cinder-api-1\" (UID: \"b1c64c98-e301-4386-b33e-ccd4fde7592d\") " pod="openstack/cinder-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.039351 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77dd7dfdbc-tsrfv"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.050543 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55b9d64f77-lv45q"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.052162 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.053206 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.067587 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b9d64f77-lv45q"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.069785 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mbm\" (UniqueName: \"kubernetes.io/projected/8671593e-1709-4d99-ae81-8639ee492d20-kube-api-access-v7mbm\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.069853 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp2gq\" (UniqueName: \"kubernetes.io/projected/0ca55d43-e73b-403b-9760-f71e8b926650-kube-api-access-hp2gq\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.069893 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.069920 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-lib-modules\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.069956 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-nvme\") pod 
\"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.069981 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070006 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070064 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070092 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070123 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " 
pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070153 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070190 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070212 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.073365 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.073930 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.073965 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.074016 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.077108 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.077573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.070237 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8671593e-1709-4d99-ae81-8639ee492d20-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.086944 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-sys\") 
pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.086972 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087019 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087042 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0ca55d43-e73b-403b-9760-f71e8b926650-ceph\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087101 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 
12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087132 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-run\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087193 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087208 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087243 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-sys\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087271 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087295 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-scripts\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087315 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087336 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087354 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-config-data\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087406 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-dev\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087431 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-run\") pod \"cinder-backup-0\" (UID: 
\"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087570 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087604 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087706 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087781 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087825 4921 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.087847 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8671593e-1709-4d99-ae81-8639ee492d20-run\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.092848 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8671593e-1709-4d99-ae81-8639ee492d20-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.096205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.096255 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8671593e-1709-4d99-ae81-8639ee492d20-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.098267 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mbm\" (UniqueName: \"kubernetes.io/projected/8671593e-1709-4d99-ae81-8639ee492d20-kube-api-access-v7mbm\") pod \"cinder-volume-volume1-0\" (UID: 
\"8671593e-1709-4d99-ae81-8639ee492d20\") " pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.188965 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-ovndb-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189005 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-ovndb-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189030 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189052 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0ca55d43-e73b-403b-9760-f71e8b926650-ceph\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189077 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhx2w\" (UniqueName: \"kubernetes.io/projected/0cc858fd-b2e1-4626-9e77-215bd07e374f-kube-api-access-rhx2w\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " 
pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189103 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189131 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189210 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-sys\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189264 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189286 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-public-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189310 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-internal-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189346 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-scripts\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189386 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-config\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189404 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189428 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-config-data\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-dev\") pod 
\"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189530 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-combined-ca-bundle\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189546 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-run\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189566 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9l84\" (UniqueName: \"kubernetes.io/projected/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-kube-api-access-d9l84\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189604 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-public-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189673 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp2gq\" (UniqueName: \"kubernetes.io/projected/0ca55d43-e73b-403b-9760-f71e8b926650-kube-api-access-hp2gq\") pod \"cinder-backup-0\" (UID: 
\"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189692 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-sys\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189718 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-lib-modules\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189742 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-lib-modules\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189763 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-combined-ca-bundle\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189794 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-internal-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189895 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189914 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.189979 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-httpd-config\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190010 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-httpd-config\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190114 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-config\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190658 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.190690 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.191248 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc 
kubenswrapper[4921]: I0312 14:01:35.191653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.192279 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.192385 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-dev\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.192385 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0ca55d43-e73b-403b-9760-f71e8b926650-run\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.194245 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-scripts\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.196438 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-config-data\") pod \"cinder-backup-0\" (UID: 
\"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.198143 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0ca55d43-e73b-403b-9760-f71e8b926650-ceph\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.198849 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca55d43-e73b-403b-9760-f71e8b926650-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.207555 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp2gq\" (UniqueName: \"kubernetes.io/projected/0ca55d43-e73b-403b-9760-f71e8b926650-kube-api-access-hp2gq\") pod \"cinder-backup-0\" (UID: \"0ca55d43-e73b-403b-9760-f71e8b926650\") " pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.291658 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-public-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292004 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-internal-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292030 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-config\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292070 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-combined-ca-bundle\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292087 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9l84\" (UniqueName: \"kubernetes.io/projected/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-kube-api-access-d9l84\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292108 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-public-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-combined-ca-bundle\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292155 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-internal-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292188 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-httpd-config\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292213 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-httpd-config\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292241 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-config\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292274 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-ovndb-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292292 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-ovndb-tls-certs\") pod 
\"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.292318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhx2w\" (UniqueName: \"kubernetes.io/projected/0cc858fd-b2e1-4626-9e77-215bd07e374f-kube-api-access-rhx2w\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.296207 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-ovndb-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.303312 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-internal-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.308137 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-httpd-config\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.312584 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-config\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 
14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.316598 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9l84\" (UniqueName: \"kubernetes.io/projected/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-kube-api-access-d9l84\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.317953 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-combined-ca-bundle\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.319474 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.321418 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.327742 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-combined-ca-bundle\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.327754 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-public-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.328270 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-public-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.328353 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-ovndb-tls-certs\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.328981 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhx2w\" (UniqueName: \"kubernetes.io/projected/0cc858fd-b2e1-4626-9e77-215bd07e374f-kube-api-access-rhx2w\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.329140 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-internal-tls-certs\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.330328 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-httpd-config\") pod \"neutron-77dd7dfdbc-tsrfv\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.330785 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.331195 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.338540 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-config\") pod \"neutron-55b9d64f77-lv45q\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.353447 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.363715 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.370707 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.385883 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c8b44c5c7-l6d8m"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.394175 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-internal-tls-certs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.394289 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-config-data\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.394327 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-public-tls-certs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.394353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.394382 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-logs\") pod \"nova-api-1\" (UID: 
\"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.394414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w6fx\" (UniqueName: \"kubernetes.io/projected/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-kube-api-access-6w6fx\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.395497 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.397496 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.401989 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502364 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502454 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cae9c939-db1a-4372-b8a0-ff4e9892cb85-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502486 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-logs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 
14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502527 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae9c939-db1a-4372-b8a0-ff4e9892cb85-config\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502547 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w6fx\" (UniqueName: \"kubernetes.io/projected/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-kube-api-access-6w6fx\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502577 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502597 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502652 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-internal-tls-certs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502716 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cae9c939-db1a-4372-b8a0-ff4e9892cb85-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502782 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502857 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87b5t\" (UniqueName: \"kubernetes.io/projected/cae9c939-db1a-4372-b8a0-ff4e9892cb85-kube-api-access-87b5t\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502885 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-config-data\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.502925 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-public-tls-certs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.511615 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-public-tls-certs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.511752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-config-data\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.518008 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-logs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.522203 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-combined-ca-bundle\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.522585 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-internal-tls-certs\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.525378 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/horizon-5dc95b94d7-gz566"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.527943 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.535561 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.535568 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.535596 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.535939 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-fqrdt" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.544199 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w6fx\" (UniqueName: \"kubernetes.io/projected/ae5ecb59-c6e0-4a5f-a034-059935a3eaff-kube-api-access-6w6fx\") pod \"nova-api-1\" (UID: \"ae5ecb59-c6e0-4a5f-a034-059935a3eaff\") " pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.569021 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dc95b94d7-gz566"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.610880 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618500 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618541 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87b5t\" (UniqueName: \"kubernetes.io/projected/cae9c939-db1a-4372-b8a0-ff4e9892cb85-kube-api-access-87b5t\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618601 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cae9c939-db1a-4372-b8a0-ff4e9892cb85-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618635 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f081f129-7b40-467c-98cc-420f18d1d3ca-horizon-secret-key\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618662 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae9c939-db1a-4372-b8a0-ff4e9892cb85-config\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618687 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618701 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618719 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f081f129-7b40-467c-98cc-420f18d1d3ca-logs\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618755 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-config-data\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618784 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxqr\" (UniqueName: \"kubernetes.io/projected/f081f129-7b40-467c-98cc-420f18d1d3ca-kube-api-access-5nxqr\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618831 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-scripts\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.618873 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cae9c939-db1a-4372-b8a0-ff4e9892cb85-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.619411 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cae9c939-db1a-4372-b8a0-ff4e9892cb85-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.619620 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae9c939-db1a-4372-b8a0-ff4e9892cb85-config\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.619862 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") device mount path \"/mnt/openstack/pv13\"" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.630071 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " 
pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.632873 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cae9c939-db1a-4372-b8a0-ff4e9892cb85-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.644799 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.644841 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cae9c939-db1a-4372-b8a0-ff4e9892cb85-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: W0312 14:01:35.648309 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1c64c98_e301_4386_b33e_ccd4fde7592d.slice/crio-0eb9bb1350edca532ec297f49970646e1e1d645bcb11072cd86b88e23b778166 WatchSource:0}: Error finding container 0eb9bb1350edca532ec297f49970646e1e1d645bcb11072cd86b88e23b778166: Status 404 returned error can't find the container with id 0eb9bb1350edca532ec297f49970646e1e1d645bcb11072cd86b88e23b778166 Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.667070 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87b5t\" (UniqueName: \"kubernetes.io/projected/cae9c939-db1a-4372-b8a0-ff4e9892cb85-kube-api-access-87b5t\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " 
pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.671043 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.676671 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"ovsdbserver-nb-1\" (UID: \"cae9c939-db1a-4372-b8a0-ff4e9892cb85\") " pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.697174 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.700351 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.704981 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-rpw6q" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.705167 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.705385 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.705501 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.711013 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56b9755d7c-lfsh8"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.712499 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.720498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f081f129-7b40-467c-98cc-420f18d1d3ca-horizon-secret-key\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.720548 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f081f129-7b40-467c-98cc-420f18d1d3ca-logs\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.720573 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-config-data\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.720626 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxqr\" (UniqueName: \"kubernetes.io/projected/f081f129-7b40-467c-98cc-420f18d1d3ca-kube-api-access-5nxqr\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.720657 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-scripts\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 
14:01:35.721266 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f081f129-7b40-467c-98cc-420f18d1d3ca-logs\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.721880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-scripts\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.728645 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.730036 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.737387 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-config-data\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.744725 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.747109 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.755422 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f081f129-7b40-467c-98cc-420f18d1d3ca-horizon-secret-key\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.755592 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxqr\" (UniqueName: \"kubernetes.io/projected/f081f129-7b40-467c-98cc-420f18d1d3ca-kube-api-access-5nxqr\") pod \"horizon-5dc95b94d7-gz566\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.783784 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.806643 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56b9755d7c-lfsh8"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822158 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822216 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-logs\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 
14:01:35.822255 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-config-data\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822284 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-scripts\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822328 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-public-tls-certs\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822353 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmm4\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-kube-api-access-4nmm4\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822386 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-scripts\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc 
kubenswrapper[4921]: I0312 14:01:35.822441 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-combined-ca-bundle\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822465 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-ceph\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822499 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822524 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822552 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-config-data\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc 
kubenswrapper[4921]: I0312 14:01:35.822581 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-ceph\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822603 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp78f\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-kube-api-access-wp78f\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822674 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e083a0f-e15a-4541-ac5b-2870ce8a245c-horizon-secret-key\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822695 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-logs\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822723 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-scripts\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 
crc kubenswrapper[4921]: I0312 14:01:35.822758 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vbr\" (UniqueName: \"kubernetes.io/projected/0e083a0f-e15a-4541-ac5b-2870ce8a245c-kube-api-access-z9vbr\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-config-data\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822828 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e083a0f-e15a-4541-ac5b-2870ce8a245c-logs\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822855 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.822883 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc 
kubenswrapper[4921]: I0312 14:01:35.822907 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.836788 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.838889 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.846126 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.847908 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.858711 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.869134 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.870836 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.881790 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.897661 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924253 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e083a0f-e15a-4541-ac5b-2870ce8a245c-horizon-secret-key\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924290 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-logs\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924319 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-scripts\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924351 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vbr\" (UniqueName: \"kubernetes.io/projected/0e083a0f-e15a-4541-ac5b-2870ce8a245c-kube-api-access-z9vbr\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924373 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-config-data\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924395 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e083a0f-e15a-4541-ac5b-2870ce8a245c-logs\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924416 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924447 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924465 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-httpd-run\") pod \"glance-default-external-api-1\" 
(UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924484 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-logs\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924506 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-config-data\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924526 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-scripts\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924552 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-public-tls-certs\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924571 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmm4\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-kube-api-access-4nmm4\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " 
pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924594 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-scripts\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924636 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-combined-ca-bundle\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924654 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-ceph\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924681 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924695 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 
14:01:35.924714 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-config-data\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924735 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-ceph\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.924751 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp78f\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-kube-api-access-wp78f\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.926764 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-logs\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.928774 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-config-data\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.929921 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.936247 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.936417 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-scripts\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.936580 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e083a0f-e15a-4541-ac5b-2870ce8a245c-logs\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.936740 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.937064 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-logs\") pod \"glance-default-external-api-0\" 
(UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.937099 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-scripts\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.937780 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") device mount path \"/mnt/openstack/pv14\"" pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.940271 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-config-data\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.943311 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-ceph\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.947360 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e083a0f-e15a-4541-ac5b-2870ce8a245c-horizon-secret-key\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc 
kubenswrapper[4921]: I0312 14:01:35.947761 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-public-tls-certs\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.949968 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-ceph\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.949987 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-config-data\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.956773 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vbr\" (UniqueName: \"kubernetes.io/projected/0e083a0f-e15a-4541-ac5b-2870ce8a245c-kube-api-access-z9vbr\") pod \"horizon-56b9755d7c-lfsh8\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.956861 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-scripts\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.957456 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wp78f\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-kube-api-access-wp78f\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.968859 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-combined-ca-bundle\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.971056 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nmm4\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-kube-api-access-4nmm4\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.972251 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.973751 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:35 crc kubenswrapper[4921]: I0312 14:01:35.986590 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" 
Mar 12 14:01:35 crc kubenswrapper[4921]: E0312 14:01:35.987211 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.000521 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.001058 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.026900 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027210 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-scripts\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " 
pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027234 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027252 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027367 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027393 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-combined-ca-bundle\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027409 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: 
\"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027450 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-internal-tls-certs\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027473 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-logs\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-ceph\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027535 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cljv8\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-kube-api-access-cljv8\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027573 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027614 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fmh4\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-kube-api-access-7fmh4\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027633 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027706 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027723 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.027795 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-config-data\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.041165 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.061407 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.086731 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.132587 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-config-data\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.132782 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.132844 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-scripts\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.132885 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.132911 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc 
kubenswrapper[4921]: I0312 14:01:36.133114 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133176 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-combined-ca-bundle\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133224 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133265 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-internal-tls-certs\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133319 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-logs\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133355 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-ceph\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133410 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cljv8\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-kube-api-access-cljv8\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133438 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133498 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fmh4\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-kube-api-access-7fmh4\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133530 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133595 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133668 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.133693 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.134739 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") device mount path \"/mnt/openstack/pv16\"" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.134754 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.135430 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.136046 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-logs\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.136358 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-logs\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.136454 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.149549 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-scripts\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.149749 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " 
pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.150050 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.150134 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-ceph\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.150201 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.152198 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-internal-tls-certs\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.152945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.158073 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-config-data\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.160670 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.172871 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cljv8\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-kube-api-access-cljv8\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.177324 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-combined-ca-bundle\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.178769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fmh4\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-kube-api-access-7fmh4\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.197046 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.232030 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.308266 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-1" event={"ID":"b1c64c98-e301-4386-b33e-ccd4fde7592d","Type":"ContainerStarted","Data":"0eb9bb1350edca532ec297f49970646e1e1d645bcb11072cd86b88e23b778166"} Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.316355 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c8b44c5c7-l6d8m" event={"ID":"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118","Type":"ContainerStarted","Data":"7ebce26958813c112e2c390fd0e1a8aaed42d4793b67e9fecb00d09529fdad7e"} Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.316393 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c8b44c5c7-l6d8m" event={"ID":"8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118","Type":"ContainerStarted","Data":"6c4a56f434be5413b2efec2dafd76da9b15b6de0f8376d51b2f6f2b9058f8b31"} Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.342387 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c8b44c5c7-l6d8m" podStartSLOduration=2.342366216 podStartE2EDuration="2.342366216s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:36.34088161 +0000 UTC m=+3119.030953601" 
watchObservedRunningTime="2026-03-12 14:01:36.342366216 +0000 UTC m=+3119.032438187" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.412698 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.422879 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.491337 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b9d64f77-lv45q"] Mar 12 14:01:36 crc kubenswrapper[4921]: W0312 14:01:36.506367 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc858fd_b2e1_4626_9e77_215bd07e374f.slice/crio-0fe3edd5e0220a288f865544cc65f17f03a77377f9fa75bb5e2d75800c42cac5 WatchSource:0}: Error finding container 0fe3edd5e0220a288f865544cc65f17f03a77377f9fa75bb5e2d75800c42cac5: Status 404 returned error can't find the container with id 0fe3edd5e0220a288f865544cc65f17f03a77377f9fa75bb5e2d75800c42cac5 Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.537131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5dc95b94d7-gz566"] Mar 12 14:01:36 crc kubenswrapper[4921]: W0312 14:01:36.574778 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf081f129_7b40_467c_98cc_420f18d1d3ca.slice/crio-8c88643a54b8332d972598d1ec1b39c6bdd0cbd6e46059c936b22bac501508bc WatchSource:0}: Error finding container 8c88643a54b8332d972598d1ec1b39c6bdd0cbd6e46059c936b22bac501508bc: Status 404 returned error can't find the container with id 8c88643a54b8332d972598d1ec1b39c6bdd0cbd6e46059c936b22bac501508bc Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.598827 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-1"] Mar 12 14:01:36 crc kubenswrapper[4921]: W0312 14:01:36.614972 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae5ecb59_c6e0_4a5f_a034_059935a3eaff.slice/crio-2bde7bd270d8d57d4fab188978077be91de5fe2a86c18efed4cb7f703e01a783 WatchSource:0}: Error finding container 2bde7bd270d8d57d4fab188978077be91de5fe2a86c18efed4cb7f703e01a783: Status 404 returned error can't find the container with id 2bde7bd270d8d57d4fab188978077be91de5fe2a86c18efed4cb7f703e01a783 Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.616177 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.691641 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 12 14:01:36 crc kubenswrapper[4921]: W0312 14:01:36.699217 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcae9c939_db1a_4372_b8a0_ff4e9892cb85.slice/crio-56339da527c3a20c07d3867ae2d745fbd915cae316614056f3bfad68a620928f WatchSource:0}: Error finding container 56339da527c3a20c07d3867ae2d745fbd915cae316614056f3bfad68a620928f: Status 404 returned error can't find the container with id 56339da527c3a20c07d3867ae2d745fbd915cae316614056f3bfad68a620928f Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.847769 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.889313 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56b9755d7c-lfsh8"] Mar 12 14:01:36 crc kubenswrapper[4921]: I0312 14:01:36.945260 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.147240 4921 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-5664d5cbb7-9rpxn"] Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.152046 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.182957 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5664d5cbb7-9rpxn"] Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.236314 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:37 crc kubenswrapper[4921]: W0312 14:01:37.251938 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod708d9f61_855e_4fa5_b8fc_acb8b745f37b.slice/crio-ae2e27fa2f00d330d14afd8cf5c5b12bde968559279455e49f887e6b0576d7ca WatchSource:0}: Error finding container ae2e27fa2f00d330d14afd8cf5c5b12bde968559279455e49f887e6b0576d7ca: Status 404 returned error can't find the container with id ae2e27fa2f00d330d14afd8cf5c5b12bde968559279455e49f887e6b0576d7ca Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.263294 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-openstack-edpm-ipam\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.263331 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-ovsdbserver-nb\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 
14:01:37.263630 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b8dw\" (UniqueName: \"kubernetes.io/projected/5f732887-96f4-4cd5-9a36-df3848958280-kube-api-access-5b8dw\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.263852 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-ovsdbserver-sb\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.264005 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-config\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.264097 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-dns-svc\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.341195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-1" event={"ID":"b1c64c98-e301-4386-b33e-ccd4fde7592d","Type":"ContainerStarted","Data":"4f6d6328a82104119af70e66960ba836e3abf36476adf0c69101f24ddb45d095"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.353641 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-1" event={"ID":"ae5ecb59-c6e0-4a5f-a034-059935a3eaff","Type":"ContainerStarted","Data":"2bde7bd270d8d57d4fab188978077be91de5fe2a86c18efed4cb7f703e01a783"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.365129 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1","Type":"ContainerStarted","Data":"d4a4c0f1cb5cef283140a658199f1beacbcb3d6f525ca03088007c2aeb25ddef"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.366076 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-ovsdbserver-sb\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.366160 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-config\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.366200 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-dns-svc\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.366238 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-openstack-edpm-ipam\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " 
pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.366256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-ovsdbserver-nb\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.366290 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b8dw\" (UniqueName: \"kubernetes.io/projected/5f732887-96f4-4cd5-9a36-df3848958280-kube-api-access-5b8dw\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.367945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-ovsdbserver-nb\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.368263 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-ovsdbserver-sb\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.368491 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-dns-svc\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc 
kubenswrapper[4921]: I0312 14:01:37.369850 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dc95b94d7-gz566" event={"ID":"f081f129-7b40-467c-98cc-420f18d1d3ca","Type":"ContainerStarted","Data":"8c88643a54b8332d972598d1ec1b39c6bdd0cbd6e46059c936b22bac501508bc"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.371478 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-config\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.373789 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/5f732887-96f4-4cd5-9a36-df3848958280-openstack-edpm-ipam\") pod \"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.374743 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56b9755d7c-lfsh8" event={"ID":"0e083a0f-e15a-4541-ac5b-2870ce8a245c","Type":"ContainerStarted","Data":"2b315090f86039051b321c497214e4f06a7ca0f598697d9a9782187cd8e8c3c8"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.375971 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a","Type":"ContainerStarted","Data":"1a75bef7719a50942d89c8cadf2f77fb37e0f78b2faa53448550f73f3b1d7997"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.376923 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b9d64f77-lv45q" event={"ID":"0cc858fd-b2e1-4626-9e77-215bd07e374f","Type":"ContainerStarted","Data":"94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403"} Mar 12 
14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.376941 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b9d64f77-lv45q" event={"ID":"0cc858fd-b2e1-4626-9e77-215bd07e374f","Type":"ContainerStarted","Data":"0fe3edd5e0220a288f865544cc65f17f03a77377f9fa75bb5e2d75800c42cac5"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.377787 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"cae9c939-db1a-4372-b8a0-ff4e9892cb85","Type":"ContainerStarted","Data":"00ad716bcfb0460684f8f9bf3ad61dc4384b2981c109ba9c511b387bf8886887"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.377808 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"cae9c939-db1a-4372-b8a0-ff4e9892cb85","Type":"ContainerStarted","Data":"56339da527c3a20c07d3867ae2d745fbd915cae316614056f3bfad68a620928f"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.380843 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0ca55d43-e73b-403b-9760-f71e8b926650","Type":"ContainerStarted","Data":"c3f83b2c30a38895c33db87dc478551c257f0871606886d1f646063a6b93c222"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.382327 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"708d9f61-855e-4fa5-b8fc-acb8b745f37b","Type":"ContainerStarted","Data":"ae2e27fa2f00d330d14afd8cf5c5b12bde968559279455e49f887e6b0576d7ca"} Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.382445 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.396261 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b8dw\" (UniqueName: \"kubernetes.io/projected/5f732887-96f4-4cd5-9a36-df3848958280-kube-api-access-5b8dw\") pod 
\"dnsmasq-dns-5664d5cbb7-9rpxn\" (UID: \"5f732887-96f4-4cd5-9a36-df3848958280\") " pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.525078 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.552010 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 12 14:01:37 crc kubenswrapper[4921]: I0312 14:01:37.962903 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.387231 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5664d5cbb7-9rpxn"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.440689 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a","Type":"ContainerStarted","Data":"11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.480979 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1","Type":"ContainerStarted","Data":"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.498232 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" event={"ID":"5f732887-96f4-4cd5-9a36-df3848958280","Type":"ContainerStarted","Data":"7f8737e2b165de6d0767d7879aea446201bf9e072911b3ffb340ee2a1cfca784"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.500715 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56b9755d7c-lfsh8"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.509185 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"cae9c939-db1a-4372-b8a0-ff4e9892cb85","Type":"ContainerStarted","Data":"a5197aa4bb2cbbf9608bb3db7f3e8454669d212bc3d5ca400339acf2b4dae1aa"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.513427 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77dd7dfdbc-tsrfv"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.515576 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" event={"ID":"ae5ecb59-c6e0-4a5f-a034-059935a3eaff","Type":"ContainerStarted","Data":"284c300afcd34051324af59557dfb5759116018509feff0a72af06caf252ca65"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.516823 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5d74c96-9dfe-4db7-8287-b68a27840cf8","Type":"ContainerStarted","Data":"c3dc3b43a1d58be9bb151572cc8beb4943ca96d1db80efb2c460e3d4ada5ed5e"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.518719 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8671593e-1709-4d99-ae81-8639ee492d20","Type":"ContainerStarted","Data":"f1611f937f00f720702464869a25c0831da7b3eddb25ae23223a1eb5aadcd491"} Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.550000 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.576761 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f698cc456-lcngv"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.578758 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.581356 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.591202 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f698cc456-lcngv"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.609952 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.619334 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dc95b94d7-gz566"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.629923 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-bbd56cc76-cwl96"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.632507 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.638373 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bbd56cc76-cwl96"] Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.639021 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.639004242 podStartE2EDuration="4.639004242s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:38.522838681 +0000 UTC m=+3121.212910652" watchObservedRunningTime="2026-03-12 14:01:38.639004242 +0000 UTC m=+3121.329076213" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-horizon-secret-key\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709679 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6e62dec-8193-4d3c-a111-2ee250f79b86-config-data\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709699 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bhz7\" (UniqueName: \"kubernetes.io/projected/e6e62dec-8193-4d3c-a111-2ee250f79b86-kube-api-access-9bhz7\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709718 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-combined-ca-bundle\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709762 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6e62dec-8193-4d3c-a111-2ee250f79b86-scripts\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709790 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e88e6256-b5e0-44bc-8f61-31e31844f957-logs\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709913 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28xzm\" (UniqueName: \"kubernetes.io/projected/e88e6256-b5e0-44bc-8f61-31e31844f957-kube-api-access-28xzm\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709931 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-secret-key\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709950 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-tls-certs\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709969 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e62dec-8193-4d3c-a111-2ee250f79b86-logs\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.709999 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-combined-ca-bundle\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.710040 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-scripts\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.710089 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-horizon-tls-certs\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.710120 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-config-data\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.731390 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.811904 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e88e6256-b5e0-44bc-8f61-31e31844f957-logs\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.811977 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28xzm\" (UniqueName: \"kubernetes.io/projected/e88e6256-b5e0-44bc-8f61-31e31844f957-kube-api-access-28xzm\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812007 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-secret-key\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812034 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-tls-certs\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812068 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e62dec-8193-4d3c-a111-2ee250f79b86-logs\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812144 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-combined-ca-bundle\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812209 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-scripts\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812286 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-horizon-tls-certs\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812333 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-config-data\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812383 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-horizon-secret-key\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812409 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6e62dec-8193-4d3c-a111-2ee250f79b86-config-data\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812433 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bhz7\" (UniqueName: \"kubernetes.io/projected/e6e62dec-8193-4d3c-a111-2ee250f79b86-kube-api-access-9bhz7\") pod 
\"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812457 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-combined-ca-bundle\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.812509 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6e62dec-8193-4d3c-a111-2ee250f79b86-scripts\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.813436 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6e62dec-8193-4d3c-a111-2ee250f79b86-scripts\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.813743 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e88e6256-b5e0-44bc-8f61-31e31844f957-logs\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.814416 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6e62dec-8193-4d3c-a111-2ee250f79b86-logs\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.815135 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6e62dec-8193-4d3c-a111-2ee250f79b86-config-data\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.815404 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-scripts\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.818009 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-config-data\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.827280 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-combined-ca-bundle\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.828653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-tls-certs\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.830315 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28xzm\" (UniqueName: 
\"kubernetes.io/projected/e88e6256-b5e0-44bc-8f61-31e31844f957-kube-api-access-28xzm\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.832013 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-horizon-tls-certs\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.832493 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-secret-key\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.833974 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-combined-ca-bundle\") pod \"horizon-6f698cc456-lcngv\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.835941 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bhz7\" (UniqueName: \"kubernetes.io/projected/e6e62dec-8193-4d3c-a111-2ee250f79b86-kube-api-access-9bhz7\") pod \"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.836010 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6e62dec-8193-4d3c-a111-2ee250f79b86-horizon-secret-key\") pod 
\"horizon-bbd56cc76-cwl96\" (UID: \"e6e62dec-8193-4d3c-a111-2ee250f79b86\") " pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.927261 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:38 crc kubenswrapper[4921]: I0312 14:01:38.964948 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.546075 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a","Type":"ContainerStarted","Data":"42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.551240 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b9d64f77-lv45q" event={"ID":"0cc858fd-b2e1-4626-9e77-215bd07e374f","Type":"ContainerStarted","Data":"1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.552320 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.566206 4921 generic.go:334] "Generic (PLEG): container finished" podID="5f732887-96f4-4cd5-9a36-df3848958280" containerID="66d1b159480bbbfe5a3a06d65b46ee08456ad2c7dcfb411c7e06f59b9ef77dca" exitCode=0 Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.566318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" event={"ID":"5f732887-96f4-4cd5-9a36-df3848958280","Type":"ContainerDied","Data":"66d1b159480bbbfe5a3a06d65b46ee08456ad2c7dcfb411c7e06f59b9ef77dca"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.583454 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1" 
event={"ID":"ae5ecb59-c6e0-4a5f-a034-059935a3eaff","Type":"ContainerStarted","Data":"44b4d1246ae01da0616f03843ae06a4d396c49d1d1287fa3b57488cb7fd1dc95"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.585447 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.585423402 podStartE2EDuration="5.585423402s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:39.566360484 +0000 UTC m=+3122.256432455" watchObservedRunningTime="2026-03-12 14:01:39.585423402 +0000 UTC m=+3122.275495373" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.589840 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"708d9f61-855e-4fa5-b8fc-acb8b745f37b","Type":"ContainerStarted","Data":"83326320c66c2208b995e7c5fe0d56a94154636a9722ca249c50e59ed101c393"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.591618 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-tsrfv" event={"ID":"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5","Type":"ContainerStarted","Data":"15dcb7800a5ace68fcd8ca6e2d30df6e7f87ab1a9fafbe0d36b28f5bfd5746c9"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.591651 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-tsrfv" event={"ID":"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5","Type":"ContainerStarted","Data":"8a7c795bd4de4e959fb37f8009f3bcbd6ea6786f1d92d08c401a219f8c5f0efb"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.593259 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1","Type":"ContainerStarted","Data":"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e"} Mar 12 14:01:39 crc 
kubenswrapper[4921]: I0312 14:01:39.593368 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-1" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-log" containerID="cri-o://a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608" gracePeriod=30 Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.596093 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-1" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-httpd" containerID="cri-o://958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e" gracePeriod=30 Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.612404 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-1" event={"ID":"b1c64c98-e301-4386-b33e-ccd4fde7592d","Type":"ContainerStarted","Data":"f5f89412fd6125d545e8acf754648a5116ef438b472ab22b2059ad8b9fb82ad4"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.612486 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-1" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.633423 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55b9d64f77-lv45q" podStartSLOduration=5.633402 podStartE2EDuration="5.633402s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:39.624855587 +0000 UTC m=+3122.314927558" watchObservedRunningTime="2026-03-12 14:01:39.633402 +0000 UTC m=+3122.323473971" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.648945 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d5d74c96-9dfe-4db7-8287-b68a27840cf8","Type":"ContainerStarted","Data":"2b1b6d94e766a0a67f6026f13b8924ff205c3b1e83923b820ef84adac2495236"} Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.663125 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-1" podStartSLOduration=5.6631098170000005 podStartE2EDuration="5.663109817s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:39.649378934 +0000 UTC m=+3122.339450905" watchObservedRunningTime="2026-03-12 14:01:39.663109817 +0000 UTC m=+3122.353181788" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.712738 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1" podStartSLOduration=4.712716145 podStartE2EDuration="4.712716145s" podCreationTimestamp="2026-03-12 14:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:39.669110752 +0000 UTC m=+3122.359182733" watchObservedRunningTime="2026-03-12 14:01:39.712716145 +0000 UTC m=+3122.402788126" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.765162 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-1" podStartSLOduration=4.765043278 podStartE2EDuration="4.765043278s" podCreationTimestamp="2026-03-12 14:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:39.703750499 +0000 UTC m=+3122.393822480" watchObservedRunningTime="2026-03-12 14:01:39.765043278 +0000 UTC m=+3122.455115249" Mar 12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.801916 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-bbd56cc76-cwl96"] Mar 
12 14:01:39 crc kubenswrapper[4921]: I0312 14:01:39.830519 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f698cc456-lcngv"] Mar 12 14:01:39 crc kubenswrapper[4921]: W0312 14:01:39.840807 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode88e6256_b5e0_44bc_8f61_31e31844f957.slice/crio-8fd31093ee07048c158413e63507d2b07e091454438290e32c63bfeb9d0ef4e0 WatchSource:0}: Error finding container 8fd31093ee07048c158413e63507d2b07e091454438290e32c63bfeb9d0ef4e0: Status 404 returned error can't find the container with id 8fd31093ee07048c158413e63507d2b07e091454438290e32c63bfeb9d0ef4e0 Mar 12 14:01:39 crc kubenswrapper[4921]: E0312 14:01:39.846397 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbde48fe_130e_4fd9_bf8c_5c2a28b3b6b1.slice/crio-conmon-958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbde48fe_130e_4fd9_bf8c_5c2a28b3b6b1.slice/crio-a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608.scope\": RecentStats: unable to find data in memory cache]" Mar 12 14:01:39 crc kubenswrapper[4921]: W0312 14:01:39.867456 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e62dec_8193_4d3c_a111_2ee250f79b86.slice/crio-5ffb77743fdf22759bf7e011b27890b801b29aa43727e3006af1df9cd23f9db6 WatchSource:0}: Error finding container 5ffb77743fdf22759bf7e011b27890b801b29aa43727e3006af1df9cd23f9db6: Status 404 returned error can't find the container with id 5ffb77743fdf22759bf7e011b27890b801b29aa43727e3006af1df9cd23f9db6 Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.345363 4921 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-1" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.484739 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-ceph\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485202 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-combined-ca-bundle\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485281 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-config-data\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485309 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-scripts\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485384 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-logs\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485439 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-public-tls-certs\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485478 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485507 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp78f\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-kube-api-access-wp78f\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.485532 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-httpd-run\") pod \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\" (UID: \"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1\") " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.486473 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-logs" (OuterVolumeSpecName: "logs") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.486614 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.490620 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-scripts" (OuterVolumeSpecName: "scripts") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.494461 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-kube-api-access-wp78f" (OuterVolumeSpecName: "kube-api-access-wp78f") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "kube-api-access-wp78f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.503828 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-ceph" (OuterVolumeSpecName: "ceph") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.503916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "local-storage14-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.588452 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.588503 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.588514 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.588534 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.588543 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp78f\" (UniqueName: \"kubernetes.io/projected/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-kube-api-access-wp78f\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.588555 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.624514 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.659400 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.664021 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbd56cc76-cwl96" event={"ID":"e6e62dec-8193-4d3c-a111-2ee250f79b86","Type":"ContainerStarted","Data":"5ffb77743fdf22759bf7e011b27890b801b29aa43727e3006af1df9cd23f9db6"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.667935 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8671593e-1709-4d99-ae81-8639ee492d20","Type":"ContainerStarted","Data":"a78c5b2a9c9ebca92b242c3c565646a5c321189aadd3e4d2634e922ee3dc1e16"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.667991 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8671593e-1709-4d99-ae81-8639ee492d20","Type":"ContainerStarted","Data":"057577c267a3f5895e20d03c1d12d8c47b4b403373ced2070b2a61c8692972ee"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.672685 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.680159 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-tsrfv" event={"ID":"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5","Type":"ContainerStarted","Data":"45d1d4c39d265b4155f1a4db8086bc519b8140af18883faf3df4cf5a7b54227e"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.680764 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.691301 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.691342 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.691355 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.700450 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=4.550102062 podStartE2EDuration="6.700146749s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="2026-03-12 14:01:37.708151402 +0000 UTC m=+3120.398223373" lastFinishedPulling="2026-03-12 14:01:39.858196089 +0000 UTC m=+3122.548268060" observedRunningTime="2026-03-12 14:01:40.695086243 +0000 UTC m=+3123.385158224" watchObservedRunningTime="2026-03-12 14:01:40.700146749 +0000 UTC m=+3123.390218720" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 
14:01:40.706074 4921 generic.go:334] "Generic (PLEG): container finished" podID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerID="958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e" exitCode=143 Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.706104 4921 generic.go:334] "Generic (PLEG): container finished" podID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerID="a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608" exitCode=143 Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.706154 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1","Type":"ContainerDied","Data":"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.706179 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1","Type":"ContainerDied","Data":"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.706189 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1","Type":"ContainerDied","Data":"d4a4c0f1cb5cef283140a658199f1beacbcb3d6f525ca03088007c2aeb25ddef"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.706203 4921 scope.go:117] "RemoveContainer" containerID="958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.706324 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-1" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.731288 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.738475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" event={"ID":"5f732887-96f4-4cd5-9a36-df3848958280","Type":"ContainerStarted","Data":"9d18add6c2c3444ee3ebb176af5e9b2f8957ba228e2ab30d56d4d362c5a572c0"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.739780 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.742086 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77dd7dfdbc-tsrfv" podStartSLOduration=6.74204376 podStartE2EDuration="6.74204376s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:40.720796356 +0000 UTC m=+3123.410868327" watchObservedRunningTime="2026-03-12 14:01:40.74204376 +0000 UTC m=+3123.432115741" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.755779 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f698cc456-lcngv" event={"ID":"e88e6256-b5e0-44bc-8f61-31e31844f957","Type":"ContainerStarted","Data":"8fd31093ee07048c158413e63507d2b07e091454438290e32c63bfeb9d0ef4e0"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.756912 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-config-data" (OuterVolumeSpecName: "config-data") pod "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" (UID: "fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.757652 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0ca55d43-e73b-403b-9760-f71e8b926650","Type":"ContainerStarted","Data":"5913652d986d74b493d87a9b446c26a43505eeae17f05cb72ceeb86c4eb80fbc"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.757677 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0ca55d43-e73b-403b-9760-f71e8b926650","Type":"ContainerStarted","Data":"2ac7efcccf617bfcf80a3764cd5bfb190f5f33980c4fc2886bc5bfe046781d3e"} Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.778745 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" podStartSLOduration=3.778730211 podStartE2EDuration="3.778730211s" podCreationTimestamp="2026-03-12 14:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:40.761843601 +0000 UTC m=+3123.451915582" watchObservedRunningTime="2026-03-12 14:01:40.778730211 +0000 UTC m=+3123.468802182" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.795362 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.810483 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.291118619 podStartE2EDuration="6.810454319s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="2026-03-12 14:01:36.665891157 +0000 UTC m=+3119.355963128" lastFinishedPulling="2026-03-12 14:01:39.185226857 +0000 UTC m=+3121.875298828" observedRunningTime="2026-03-12 14:01:40.806700923 +0000 UTC 
m=+3123.496772894" watchObservedRunningTime="2026-03-12 14:01:40.810454319 +0000 UTC m=+3123.500526290" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.845242 4921 scope.go:117] "RemoveContainer" containerID="a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.851448 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.851431192 podStartE2EDuration="6.851431192s" podCreationTimestamp="2026-03-12 14:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:40.830556409 +0000 UTC m=+3123.520628380" watchObservedRunningTime="2026-03-12 14:01:40.851431192 +0000 UTC m=+3123.541503163" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.918072 4921 scope.go:117] "RemoveContainer" containerID="958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e" Mar 12 14:01:40 crc kubenswrapper[4921]: E0312 14:01:40.920468 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e\": container with ID starting with 958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e not found: ID does not exist" containerID="958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.920510 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e"} err="failed to get container status \"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e\": rpc error: code = NotFound desc = could not find container \"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e\": container with ID starting with 
958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e not found: ID does not exist" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.920535 4921 scope.go:117] "RemoveContainer" containerID="a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608" Mar 12 14:01:40 crc kubenswrapper[4921]: E0312 14:01:40.921634 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608\": container with ID starting with a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608 not found: ID does not exist" containerID="a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.921667 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608"} err="failed to get container status \"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608\": rpc error: code = NotFound desc = could not find container \"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608\": container with ID starting with a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608 not found: ID does not exist" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.921698 4921 scope.go:117] "RemoveContainer" containerID="958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.931380 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e"} err="failed to get container status \"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e\": rpc error: code = NotFound desc = could not find container \"958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e\": container with ID 
starting with 958b330d27fa8c9fdca32b662972143e43ac6afad69ed9bfb91efcd516a7b57e not found: ID does not exist" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.931417 4921 scope.go:117] "RemoveContainer" containerID="a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608" Mar 12 14:01:40 crc kubenswrapper[4921]: I0312 14:01:40.933301 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608"} err="failed to get container status \"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608\": rpc error: code = NotFound desc = could not find container \"a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608\": container with ID starting with a34056ebc46c01735254d5690572e220ac2bbaf6a04c8eeeb54a392838a23608 not found: ID does not exist" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.087893 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.099074 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.119089 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:41 crc kubenswrapper[4921]: E0312 14:01:41.119547 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-httpd" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.119564 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-httpd" Mar 12 14:01:41 crc kubenswrapper[4921]: E0312 14:01:41.119603 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-log" Mar 12 14:01:41 crc 
kubenswrapper[4921]: I0312 14:01:41.119611 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-log" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.119789 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-log" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.119826 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" containerName="glance-httpd" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.120887 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.148996 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233034 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-combined-ca-bundle\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233111 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233168 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-config-data\") pod 
\"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233191 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-scripts\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233225 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-public-tls-certs\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233259 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-logs\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233311 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-ceph\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233339 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-httpd-run\") pod 
\"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.233383 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tts8s\" (UniqueName: \"kubernetes.io/projected/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-kube-api-access-tts8s\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335576 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-ceph\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335647 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335714 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tts8s\" (UniqueName: \"kubernetes.io/projected/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-kube-api-access-tts8s\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-combined-ca-bundle\") pod 
\"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335831 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335906 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-config-data\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335950 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-scripts\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.335995 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-public-tls-certs\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.336032 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-logs\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " 
pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.336524 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-logs\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.337973 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") device mount path \"/mnt/openstack/pv14\"" pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.338405 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.352116 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-ceph\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.352640 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-combined-ca-bundle\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 
14:01:41.354103 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-scripts\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.356074 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-public-tls-certs\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.358748 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tts8s\" (UniqueName: \"kubernetes.io/projected/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-kube-api-access-tts8s\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.364758 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b-config-data\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.393189 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-1\" (UID: \"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b\") " pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.452844 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.780870 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.810633 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5d74c96-9dfe-4db7-8287-b68a27840cf8","Type":"ContainerStarted","Data":"06ac18509ae2c75136b7365cb7941a477ff9375b3cadf673e69181b8b2c9cda0"} Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.819361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"708d9f61-855e-4fa5-b8fc-acb8b745f37b","Type":"ContainerStarted","Data":"4ac67f9ad89b4d336da226d7912a68dde958aab8a3aebb1cd0a3205f2b3433b7"} Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.820578 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-1" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-httpd" containerID="cri-o://4ac67f9ad89b4d336da226d7912a68dde958aab8a3aebb1cd0a3205f2b3433b7" gracePeriod=30 Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.823719 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-1" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-log" containerID="cri-o://83326320c66c2208b995e7c5fe0d56a94154636a9722ca249c50e59ed101c393" gracePeriod=30 Mar 12 14:01:41 crc kubenswrapper[4921]: I0312 14:01:41.868867 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-1" podStartSLOduration=6.86884034 podStartE2EDuration="6.86884034s" podCreationTimestamp="2026-03-12 14:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 14:01:41.838574348 +0000 UTC m=+3124.528646319" watchObservedRunningTime="2026-03-12 14:01:41.86884034 +0000 UTC m=+3124.558912311" Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.315701 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1" path="/var/lib/kubelet/pods/fbde48fe-130e-4fd9-bf8c-5c2a28b3b6b1/volumes" Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.430415 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-1"] Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.832273 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b","Type":"ContainerStarted","Data":"b7267222bec894a168843d3b662d2cbc0307333c91b92d8388d718419f300c06"} Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.835554 4921 generic.go:334] "Generic (PLEG): container finished" podID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerID="4ac67f9ad89b4d336da226d7912a68dde958aab8a3aebb1cd0a3205f2b3433b7" exitCode=0 Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.835576 4921 generic.go:334] "Generic (PLEG): container finished" podID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerID="83326320c66c2208b995e7c5fe0d56a94154636a9722ca249c50e59ed101c393" exitCode=143 Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.835574 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"708d9f61-855e-4fa5-b8fc-acb8b745f37b","Type":"ContainerDied","Data":"4ac67f9ad89b4d336da226d7912a68dde958aab8a3aebb1cd0a3205f2b3433b7"} Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.835621 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" 
event={"ID":"708d9f61-855e-4fa5-b8fc-acb8b745f37b","Type":"ContainerDied","Data":"83326320c66c2208b995e7c5fe0d56a94154636a9722ca249c50e59ed101c393"} Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.835634 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"708d9f61-855e-4fa5-b8fc-acb8b745f37b","Type":"ContainerDied","Data":"ae2e27fa2f00d330d14afd8cf5c5b12bde968559279455e49f887e6b0576d7ca"} Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.835645 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae2e27fa2f00d330d14afd8cf5c5b12bde968559279455e49f887e6b0576d7ca" Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.885983 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:42 crc kubenswrapper[4921]: I0312 14:01:42.897133 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.003900 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-config-data\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.004021 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.005593 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-httpd-run\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" 
(UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.005735 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-scripts\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.006054 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-ceph\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.006094 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fmh4\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-kube-api-access-7fmh4\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.006134 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-combined-ca-bundle\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.006161 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.006354 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-logs\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.006385 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-internal-tls-certs\") pod \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\" (UID: \"708d9f61-855e-4fa5-b8fc-acb8b745f37b\") " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.007384 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.007995 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-logs" (OuterVolumeSpecName: "logs") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.012387 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-scripts" (OuterVolumeSpecName: "scripts") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.017145 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-ceph" (OuterVolumeSpecName: "ceph") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.035339 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.035515 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-kube-api-access-7fmh4" (OuterVolumeSpecName: "kube-api-access-7fmh4") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "kube-api-access-7fmh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.076716 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.145121 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.145175 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.145188 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.145201 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fmh4\" (UniqueName: \"kubernetes.io/projected/708d9f61-855e-4fa5-b8fc-acb8b745f37b-kube-api-access-7fmh4\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.145217 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.145228 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/708d9f61-855e-4fa5-b8fc-acb8b745f37b-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.155966 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-config-data" (OuterVolumeSpecName: "config-data") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.166263 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.166529 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="ovn-northd" containerID="cri-o://9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc" gracePeriod=30 Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.166887 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="openstack-network-exporter" containerID="cri-o://13bffa3502335915c744aaf64a05c7953e648d43e83f1a96fa0906a41832bb73" gracePeriod=30 Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.186935 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "708d9f61-855e-4fa5-b8fc-acb8b745f37b" (UID: "708d9f61-855e-4fa5-b8fc-acb8b745f37b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.199267 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.247760 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.247802 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708d9f61-855e-4fa5-b8fc-acb8b745f37b-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.247824 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.424206 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77dd7dfdbc-tsrfv"] Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.425977 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77dd7dfdbc-tsrfv" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-api" containerID="cri-o://15dcb7800a5ace68fcd8ca6e2d30df6e7f87ab1a9fafbe0d36b28f5bfd5746c9" gracePeriod=30 Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.426116 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77dd7dfdbc-tsrfv" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-httpd" containerID="cri-o://45d1d4c39d265b4155f1a4db8086bc519b8140af18883faf3df4cf5a7b54227e" gracePeriod=30 Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.450510 4921 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-547b7895d7-9c58r"] Mar 12 14:01:43 crc kubenswrapper[4921]: E0312 14:01:43.450965 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-httpd" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.450980 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-httpd" Mar 12 14:01:43 crc kubenswrapper[4921]: E0312 14:01:43.451005 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-log" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.451010 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-log" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.451191 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-httpd" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.451215 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" containerName="glance-log" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.452923 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469578 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-httpd-config\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469654 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-public-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469713 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-ovndb-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469753 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-config\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469802 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-combined-ca-bundle\") pod \"neutron-547b7895d7-9c58r\" (UID: 
\"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469829 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58c2x\" (UniqueName: \"kubernetes.io/projected/4d97370e-b2d5-463a-ba6d-5e8e12618140-kube-api-access-58c2x\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.469851 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-internal-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.473262 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547b7895d7-9c58r"] Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.573888 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-ovndb-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.574069 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-config\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.574201 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-combined-ca-bundle\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.574251 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58c2x\" (UniqueName: \"kubernetes.io/projected/4d97370e-b2d5-463a-ba6d-5e8e12618140-kube-api-access-58c2x\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.574295 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-internal-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.574367 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-httpd-config\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.574481 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-public-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.578429 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-public-tls-certs\") pod 
\"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.583010 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-internal-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.583150 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-combined-ca-bundle\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.592890 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-ovndb-tls-certs\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.593133 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-httpd-config\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.593482 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d97370e-b2d5-463a-ba6d-5e8e12618140-config\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc 
kubenswrapper[4921]: I0312 14:01:43.597780 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58c2x\" (UniqueName: \"kubernetes.io/projected/4d97370e-b2d5-463a-ba6d-5e8e12618140-kube-api-access-58c2x\") pod \"neutron-547b7895d7-9c58r\" (UID: \"4d97370e-b2d5-463a-ba6d-5e8e12618140\") " pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.795433 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.880668 4921 generic.go:334] "Generic (PLEG): container finished" podID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerID="45d1d4c39d265b4155f1a4db8086bc519b8140af18883faf3df4cf5a7b54227e" exitCode=0 Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.880743 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-tsrfv" event={"ID":"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5","Type":"ContainerDied","Data":"45d1d4c39d265b4155f1a4db8086bc519b8140af18883faf3df4cf5a7b54227e"} Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.884965 4921 generic.go:334] "Generic (PLEG): container finished" podID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerID="13bffa3502335915c744aaf64a05c7953e648d43e83f1a96fa0906a41832bb73" exitCode=2 Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.885007 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff3ad30a-89e1-4463-b43b-97d8af948926","Type":"ContainerDied","Data":"13bffa3502335915c744aaf64a05c7953e648d43e83f1a96fa0906a41832bb73"} Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.893288 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b","Type":"ContainerStarted","Data":"cee33cf1bd7d1a5e8776f0c6c501ce0a66bf49cfab0ca95b5c3d90e68106dbe6"} Mar 12 14:01:43 crc 
kubenswrapper[4921]: I0312 14:01:43.893322 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:43 crc kubenswrapper[4921]: I0312 14:01:43.969365 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.018478 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.050899 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.080936 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.105017 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.205202 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bm8j\" (UniqueName: \"kubernetes.io/projected/739d7b6f-9f1d-4052-958f-e08821db9361-kube-api-access-8bm8j\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.205896 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-config-data\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.205934 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-internal-tls-certs\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.206098 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/739d7b6f-9f1d-4052-958f-e08821db9361-ceph\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.206139 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.206221 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-scripts\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.206264 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-combined-ca-bundle\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.206346 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739d7b6f-9f1d-4052-958f-e08821db9361-logs\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.206404 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/739d7b6f-9f1d-4052-958f-e08821db9361-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.312986 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/739d7b6f-9f1d-4052-958f-e08821db9361-ceph\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313036 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313072 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-scripts\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313104 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-combined-ca-bundle\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313139 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739d7b6f-9f1d-4052-958f-e08821db9361-logs\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313167 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/739d7b6f-9f1d-4052-958f-e08821db9361-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313237 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bm8j\" (UniqueName: \"kubernetes.io/projected/739d7b6f-9f1d-4052-958f-e08821db9361-kube-api-access-8bm8j\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313253 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-config-data\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313269 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-internal-tls-certs\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.313611 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.314744 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/739d7b6f-9f1d-4052-958f-e08821db9361-logs\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.315006 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/739d7b6f-9f1d-4052-958f-e08821db9361-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.320885 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-combined-ca-bundle\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.323857 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-scripts\") pod 
\"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.325418 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-internal-tls-certs\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.327169 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739d7b6f-9f1d-4052-958f-e08821db9361-config-data\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.331284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/739d7b6f-9f1d-4052-958f-e08821db9361-ceph\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.341839 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bm8j\" (UniqueName: \"kubernetes.io/projected/739d7b6f-9f1d-4052-958f-e08821db9361-kube-api-access-8bm8j\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.363861 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"739d7b6f-9f1d-4052-958f-e08821db9361\") " 
pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.447633 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.683875 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547b7895d7-9c58r"] Mar 12 14:01:44 crc kubenswrapper[4921]: W0312 14:01:44.700234 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d97370e_b2d5_463a_ba6d_5e8e12618140.slice/crio-e60c04b7418641b9d93053866223500806dda430cd10d0fcbfe50af64a4f99b2 WatchSource:0}: Error finding container e60c04b7418641b9d93053866223500806dda430cd10d0fcbfe50af64a4f99b2: Status 404 returned error can't find the container with id e60c04b7418641b9d93053866223500806dda430cd10d0fcbfe50af64a4f99b2 Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.915808 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-1" event={"ID":"5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b","Type":"ContainerStarted","Data":"56de471986cb736521d0b749d05d08b23bf1c1009a92b4072f3f942e206e7a38"} Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.924232 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547b7895d7-9c58r" event={"ID":"4d97370e-b2d5-463a-ba6d-5e8e12618140","Type":"ContainerStarted","Data":"e60c04b7418641b9d93053866223500806dda430cd10d0fcbfe50af64a4f99b2"} Mar 12 14:01:44 crc kubenswrapper[4921]: I0312 14:01:44.957684 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-1" podStartSLOduration=3.957664933 podStartE2EDuration="3.957664933s" podCreationTimestamp="2026-03-12 14:01:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 
14:01:44.94559508 +0000 UTC m=+3127.635667051" watchObservedRunningTime="2026-03-12 14:01:44.957664933 +0000 UTC m=+3127.647736904" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.249400 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-1"] Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.332007 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.355608 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.672908 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.672944 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-1" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.818496 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.846298 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.971980 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"739d7b6f-9f1d-4052-958f-e08821db9361","Type":"ContainerStarted","Data":"e012234fb8e08e9a0d9376049a50793b978c9f05c8a5715c47e37e72c2a6ad33"} Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 14:01:45.979781 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547b7895d7-9c58r" event={"ID":"4d97370e-b2d5-463a-ba6d-5e8e12618140","Type":"ContainerStarted","Data":"6d352d387d4c96b7c7b834e6706e83f9d829df674d7932d1e34af740ea158cb2"} Mar 12 14:01:45 crc kubenswrapper[4921]: I0312 
14:01:45.979893 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547b7895d7-9c58r" event={"ID":"4d97370e-b2d5-463a-ba6d-5e8e12618140","Type":"ContainerStarted","Data":"b529186b1fcf1558db9a953cb5122aee57b47608db4b265887f3402ed8bdd820"} Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.010996 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708d9f61-855e-4fa5-b8fc-acb8b745f37b" path="/var/lib/kubelet/pods/708d9f61-855e-4fa5-b8fc-acb8b745f37b/volumes" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.056299 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-547b7895d7-9c58r" podStartSLOduration=3.056269283 podStartE2EDuration="3.056269283s" podCreationTimestamp="2026-03-12 14:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:46.032540922 +0000 UTC m=+3128.722612893" watchObservedRunningTime="2026-03-12 14:01:46.056269283 +0000 UTC m=+3128.746341254" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.068632 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.068689 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.180694 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.180798 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.423388 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.424945 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.563518 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.565602 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.679955 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="ae5ecb59-c6e0-4a5f-a034-059935a3eaff" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.6:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.680271 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-1" podUID="ae5ecb59-c6e0-4a5f-a034-059935a3eaff" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.6:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.992922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"739d7b6f-9f1d-4052-958f-e08821db9361","Type":"ContainerStarted","Data":"891c9c176678a7172dd7e15a6f69deb126a590173f357668639f95571d62e6d9"} Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.993397 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.994004 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 
14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.994034 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.994044 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 14:01:46 crc kubenswrapper[4921]: I0312 14:01:46.994054 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 14:01:47 crc kubenswrapper[4921]: I0312 14:01:47.527160 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5664d5cbb7-9rpxn" Mar 12 14:01:47 crc kubenswrapper[4921]: I0312 14:01:47.637041 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79794c8ddf-ht28n"] Mar 12 14:01:47 crc kubenswrapper[4921]: I0312 14:01:47.639392 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" podUID="06610185-0afb-4841-86c4-406c12519fc2" containerName="dnsmasq-dns" containerID="cri-o://ae83612ac487991e8873ddbd81c596d83cda8d817d48e2d4f3415b2af4ed9bf6" gracePeriod=10 Mar 12 14:01:47 crc kubenswrapper[4921]: E0312 14:01:47.674954 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 12 14:01:47 crc kubenswrapper[4921]: E0312 14:01:47.702997 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 12 14:01:47 crc kubenswrapper[4921]: E0312 14:01:47.719011 4921 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 12 14:01:47 crc kubenswrapper[4921]: E0312 14:01:47.719133 4921 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="ovn-northd" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.111008 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-1" event={"ID":"739d7b6f-9f1d-4052-958f-e08821db9361","Type":"ContainerStarted","Data":"3edfed77afa0da8c60f2fed2284ecd834121809dddee03112590e5a80cefb222"} Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.120907 4921 generic.go:334] "Generic (PLEG): container finished" podID="06610185-0afb-4841-86c4-406c12519fc2" containerID="ae83612ac487991e8873ddbd81c596d83cda8d817d48e2d4f3415b2af4ed9bf6" exitCode=0 Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.121099 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" event={"ID":"06610185-0afb-4841-86c4-406c12519fc2","Type":"ContainerDied","Data":"ae83612ac487991e8873ddbd81c596d83cda8d817d48e2d4f3415b2af4ed9bf6"} Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.171776 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-1" podStartSLOduration=5.171757815 podStartE2EDuration="5.171757815s" podCreationTimestamp="2026-03-12 14:01:43 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:48.144714832 +0000 UTC m=+3130.834786803" watchObservedRunningTime="2026-03-12 14:01:48.171757815 +0000 UTC m=+3130.861829786" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.447505 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.502310 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qh2q\" (UniqueName: \"kubernetes.io/projected/06610185-0afb-4841-86c4-406c12519fc2-kube-api-access-6qh2q\") pod \"06610185-0afb-4841-86c4-406c12519fc2\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.502350 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-config\") pod \"06610185-0afb-4841-86c4-406c12519fc2\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.502469 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-dns-svc\") pod \"06610185-0afb-4841-86c4-406c12519fc2\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.502569 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-nb\") pod \"06610185-0afb-4841-86c4-406c12519fc2\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.502598 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-openstack-edpm-ipam\") pod \"06610185-0afb-4841-86c4-406c12519fc2\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.502703 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-sb\") pod \"06610185-0afb-4841-86c4-406c12519fc2\" (UID: \"06610185-0afb-4841-86c4-406c12519fc2\") " Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.516862 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06610185-0afb-4841-86c4-406c12519fc2-kube-api-access-6qh2q" (OuterVolumeSpecName: "kube-api-access-6qh2q") pod "06610185-0afb-4841-86c4-406c12519fc2" (UID: "06610185-0afb-4841-86c4-406c12519fc2"). InnerVolumeSpecName "kube-api-access-6qh2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.604908 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qh2q\" (UniqueName: \"kubernetes.io/projected/06610185-0afb-4841-86c4-406c12519fc2-kube-api-access-6qh2q\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.605300 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-1" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.630127 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "06610185-0afb-4841-86c4-406c12519fc2" (UID: "06610185-0afb-4841-86c4-406c12519fc2"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.674518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06610185-0afb-4841-86c4-406c12519fc2" (UID: "06610185-0afb-4841-86c4-406c12519fc2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.680364 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-config" (OuterVolumeSpecName: "config") pod "06610185-0afb-4841-86c4-406c12519fc2" (UID: "06610185-0afb-4841-86c4-406c12519fc2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.694789 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06610185-0afb-4841-86c4-406c12519fc2" (UID: "06610185-0afb-4841-86c4-406c12519fc2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.696897 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06610185-0afb-4841-86c4-406c12519fc2" (UID: "06610185-0afb-4841-86c4-406c12519fc2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.708459 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.708495 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.708505 4921 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.708516 4921 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:48 crc kubenswrapper[4921]: I0312 14:01:48.708526 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/06610185-0afb-4841-86c4-406c12519fc2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.153932 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" event={"ID":"06610185-0afb-4841-86c4-406c12519fc2","Type":"ContainerDied","Data":"63cfcfb67afeec87b8e3f0abbbe86a3e6978915651b089f246c5f437b2318e84"} Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.154395 4921 scope.go:117] "RemoveContainer" containerID="ae83612ac487991e8873ddbd81c596d83cda8d817d48e2d4f3415b2af4ed9bf6" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.154022 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79794c8ddf-ht28n" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.154036 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.155127 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.198883 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79794c8ddf-ht28n"] Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.209779 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79794c8ddf-ht28n"] Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.789596 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.790243 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.819498 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.853044 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.857994 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 14:01:49 crc kubenswrapper[4921]: I0312 14:01:49.994367 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:01:49 crc kubenswrapper[4921]: E0312 14:01:49.994636 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:01:50 crc kubenswrapper[4921]: I0312 14:01:50.023068 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06610185-0afb-4841-86c4-406c12519fc2" path="/var/lib/kubelet/pods/06610185-0afb-4841-86c4-406c12519fc2/volumes" Mar 12 14:01:50 crc kubenswrapper[4921]: I0312 14:01:50.188651 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff3ad30a-89e1-4463-b43b-97d8af948926/ovn-northd/0.log" Mar 12 14:01:50 crc kubenswrapper[4921]: I0312 14:01:50.188717 4921 generic.go:334] "Generic (PLEG): container finished" podID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerID="9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc" exitCode=139 Mar 12 14:01:50 crc kubenswrapper[4921]: I0312 14:01:50.188900 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff3ad30a-89e1-4463-b43b-97d8af948926","Type":"ContainerDied","Data":"9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc"} Mar 12 14:01:51 crc kubenswrapper[4921]: I0312 14:01:51.454634 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-1" Mar 12 14:01:51 crc kubenswrapper[4921]: I0312 14:01:51.455032 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-1" Mar 12 14:01:51 crc kubenswrapper[4921]: I0312 14:01:51.495991 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-1" Mar 12 14:01:51 crc kubenswrapper[4921]: I0312 14:01:51.511106 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-1" 
Mar 12 14:01:52 crc kubenswrapper[4921]: I0312 14:01:52.205648 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-1" Mar 12 14:01:52 crc kubenswrapper[4921]: I0312 14:01:52.206105 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-1" Mar 12 14:01:53 crc kubenswrapper[4921]: I0312 14:01:53.678075 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Mar 12 14:01:53 crc kubenswrapper[4921]: I0312 14:01:53.681227 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.107976 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.150260 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.204062 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.205773 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-log" containerID="cri-o://11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445" gracePeriod=30 Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.206154 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-httpd" containerID="cri-o://42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de" gracePeriod=30 Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.448190 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.448534 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.492806 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.505424 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:54 crc kubenswrapper[4921]: I0312 14:01:54.843381 4921 scope.go:117] "RemoveContainer" containerID="9a63bb20238de0d1debcb2481355a1728cfc685156ca4f94e46cf92551031f8f" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.197305 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff3ad30a-89e1-4463-b43b-97d8af948926/ovn-northd/0.log" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.197548 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.272378 4921 generic.go:334] "Generic (PLEG): container finished" podID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerID="11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445" exitCode=143 Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.272477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a","Type":"ContainerDied","Data":"11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445"} Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.279050 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ff3ad30a-89e1-4463-b43b-97d8af948926/ovn-northd/0.log" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.279755 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.280268 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ff3ad30a-89e1-4463-b43b-97d8af948926","Type":"ContainerDied","Data":"7817df3f00068386c62bd5cf111d5f2e9b098f29bb543a3f47ef58fe790eb185"} Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.280375 4921 scope.go:117] "RemoveContainer" containerID="13bffa3502335915c744aaf64a05c7953e648d43e83f1a96fa0906a41832bb73" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.281211 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.281242 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.332920 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-northd-tls-certs\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.332965 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-config\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.333144 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-metrics-certs-tls-certs\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.333272 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-rundir\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.333358 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d88vt\" (UniqueName: \"kubernetes.io/projected/ff3ad30a-89e1-4463-b43b-97d8af948926-kube-api-access-d88vt\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.333455 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-scripts\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc 
kubenswrapper[4921]: I0312 14:01:55.333527 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-combined-ca-bundle\") pod \"ff3ad30a-89e1-4463-b43b-97d8af948926\" (UID: \"ff3ad30a-89e1-4463-b43b-97d8af948926\") " Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.334660 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.335031 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-scripts" (OuterVolumeSpecName: "scripts") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.335130 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-config" (OuterVolumeSpecName: "config") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.344660 4921 scope.go:117] "RemoveContainer" containerID="9c218a731fd8dacf110058c351211ce6df9c54ef138d6df7e4c76c8e4559a3bc" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.348253 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3ad30a-89e1-4463-b43b-97d8af948926-kube-api-access-d88vt" (OuterVolumeSpecName: "kube-api-access-d88vt") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "kube-api-access-d88vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.393232 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.435969 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.435993 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.436002 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.436012 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d88vt\" (UniqueName: \"kubernetes.io/projected/ff3ad30a-89e1-4463-b43b-97d8af948926-kube-api-access-d88vt\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.436023 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3ad30a-89e1-4463-b43b-97d8af948926-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.460140 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.473573 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ff3ad30a-89e1-4463-b43b-97d8af948926" (UID: "ff3ad30a-89e1-4463-b43b-97d8af948926"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.538391 4921 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.538426 4921 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff3ad30a-89e1-4463-b43b-97d8af948926-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.622197 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.638097 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.669452 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 12 14:01:55 crc kubenswrapper[4921]: E0312 14:01:55.669938 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06610185-0afb-4841-86c4-406c12519fc2" containerName="dnsmasq-dns" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.669958 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="06610185-0afb-4841-86c4-406c12519fc2" containerName="dnsmasq-dns" Mar 12 14:01:55 crc kubenswrapper[4921]: E0312 14:01:55.669982 4921 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="openstack-network-exporter" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.669992 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="openstack-network-exporter" Mar 12 14:01:55 crc kubenswrapper[4921]: E0312 14:01:55.670019 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06610185-0afb-4841-86c4-406c12519fc2" containerName="init" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.670028 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="06610185-0afb-4841-86c4-406c12519fc2" containerName="init" Mar 12 14:01:55 crc kubenswrapper[4921]: E0312 14:01:55.670045 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="ovn-northd" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.670055 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="ovn-northd" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.670292 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="06610185-0afb-4841-86c4-406c12519fc2" containerName="dnsmasq-dns" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.670315 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="openstack-network-exporter" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.670341 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" containerName="ovn-northd" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.671641 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.679070 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.679237 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.679341 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.681804 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-sx5cf" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.683570 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.687147 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.699576 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-1" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.700365 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745122 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b82052-6f75-4fe5-b4af-9726f2a59c2f-config\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745161 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745188 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47b82052-6f75-4fe5-b4af-9726f2a59c2f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745253 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745281 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47b82052-6f75-4fe5-b4af-9726f2a59c2f-scripts\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745366 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.745420 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng8d7\" (UniqueName: \"kubernetes.io/projected/47b82052-6f75-4fe5-b4af-9726f2a59c2f-kube-api-access-ng8d7\") pod \"ovn-northd-0\" 
(UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847487 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47b82052-6f75-4fe5-b4af-9726f2a59c2f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847586 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847623 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47b82052-6f75-4fe5-b4af-9726f2a59c2f-scripts\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847689 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847737 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng8d7\" (UniqueName: \"kubernetes.io/projected/47b82052-6f75-4fe5-b4af-9726f2a59c2f-kube-api-access-ng8d7\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847769 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b82052-6f75-4fe5-b4af-9726f2a59c2f-config\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.847788 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.848170 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/47b82052-6f75-4fe5-b4af-9726f2a59c2f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.849127 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/47b82052-6f75-4fe5-b4af-9726f2a59c2f-scripts\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.849430 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47b82052-6f75-4fe5-b4af-9726f2a59c2f-config\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.851473 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc 
kubenswrapper[4921]: I0312 14:01:55.852671 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.853947 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/47b82052-6f75-4fe5-b4af-9726f2a59c2f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.868611 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng8d7\" (UniqueName: \"kubernetes.io/projected/47b82052-6f75-4fe5-b4af-9726f2a59c2f-kube-api-access-ng8d7\") pod \"ovn-northd-0\" (UID: \"47b82052-6f75-4fe5-b4af-9726f2a59c2f\") " pod="openstack/ovn-northd-0" Mar 12 14:01:55 crc kubenswrapper[4921]: I0312 14:01:55.995006 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3ad30a-89e1-4463-b43b-97d8af948926" path="/var/lib/kubelet/pods/ff3ad30a-89e1-4463-b43b-97d8af948926/volumes" Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.000084 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.299556 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f698cc456-lcngv" event={"ID":"e88e6256-b5e0-44bc-8f61-31e31844f957","Type":"ContainerStarted","Data":"9c34569e4ff3e5a70ccd9fa4202d72a797cac94e1f445c82a59664436b20418c"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.299923 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f698cc456-lcngv" event={"ID":"e88e6256-b5e0-44bc-8f61-31e31844f957","Type":"ContainerStarted","Data":"8f2e2f6850e5ed504a8792d3daf8c4b2aa409ba15c8634aa1d5079c30c21ccba"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.310862 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dc95b94d7-gz566" event={"ID":"f081f129-7b40-467c-98cc-420f18d1d3ca","Type":"ContainerStarted","Data":"92eb4c6b9c5004a837e83effdc00b01cbf738416d64a6afb99e84d38fecc7584"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.310908 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dc95b94d7-gz566" event={"ID":"f081f129-7b40-467c-98cc-420f18d1d3ca","Type":"ContainerStarted","Data":"77e0acb25e568d205e7e59e9d8be85b343ee0622b60ebfc9eea244d3ed4c049e"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.311037 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dc95b94d7-gz566" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon-log" containerID="cri-o://77e0acb25e568d205e7e59e9d8be85b343ee0622b60ebfc9eea244d3ed4c049e" gracePeriod=30 Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.311301 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5dc95b94d7-gz566" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon" containerID="cri-o://92eb4c6b9c5004a837e83effdc00b01cbf738416d64a6afb99e84d38fecc7584" 
gracePeriod=30 Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.317624 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56b9755d7c-lfsh8" event={"ID":"0e083a0f-e15a-4541-ac5b-2870ce8a245c","Type":"ContainerStarted","Data":"cb52a62d9c5b19797e3ceb98c52215f057a49903ece973753480ee845479bc62"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.317662 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56b9755d7c-lfsh8" event={"ID":"0e083a0f-e15a-4541-ac5b-2870ce8a245c","Type":"ContainerStarted","Data":"e32b8fdd925530f81c05fc575e6e57e489f3e5a757c91e749b1d60ea6afd502a"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.317770 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-56b9755d7c-lfsh8" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon-log" containerID="cri-o://e32b8fdd925530f81c05fc575e6e57e489f3e5a757c91e749b1d60ea6afd502a" gracePeriod=30 Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.317862 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-56b9755d7c-lfsh8" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon" containerID="cri-o://cb52a62d9c5b19797e3ceb98c52215f057a49903ece973753480ee845479bc62" gracePeriod=30 Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.334575 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6f698cc456-lcngv" podStartSLOduration=3.239931366 podStartE2EDuration="18.334556795s" podCreationTimestamp="2026-03-12 14:01:38 +0000 UTC" firstStartedPulling="2026-03-12 14:01:39.851922596 +0000 UTC m=+3122.541994567" lastFinishedPulling="2026-03-12 14:01:54.946548015 +0000 UTC m=+3137.636619996" observedRunningTime="2026-03-12 14:01:56.328711185 +0000 UTC m=+3139.018783156" watchObservedRunningTime="2026-03-12 14:01:56.334556795 +0000 UTC m=+3139.024628776" Mar 12 14:01:56 crc 
kubenswrapper[4921]: I0312 14:01:56.341054 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbd56cc76-cwl96" event={"ID":"e6e62dec-8193-4d3c-a111-2ee250f79b86","Type":"ContainerStarted","Data":"5b0dfdd5f5d01bcedefede99f099d65bd6bb215718c7e560f082fa93b9468d05"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.341104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-bbd56cc76-cwl96" event={"ID":"e6e62dec-8193-4d3c-a111-2ee250f79b86","Type":"ContainerStarted","Data":"573f3a3edc2eb9e86516ae0d849e72b392cd45ba6a39280a011e9536b0af883a"} Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.355379 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5dc95b94d7-gz566" podStartSLOduration=3.009042295 podStartE2EDuration="21.355363266s" podCreationTimestamp="2026-03-12 14:01:35 +0000 UTC" firstStartedPulling="2026-03-12 14:01:36.598391997 +0000 UTC m=+3119.288463968" lastFinishedPulling="2026-03-12 14:01:54.944712958 +0000 UTC m=+3137.634784939" observedRunningTime="2026-03-12 14:01:56.343866161 +0000 UTC m=+3139.033938132" watchObservedRunningTime="2026-03-12 14:01:56.355363266 +0000 UTC m=+3139.045435237" Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.356981 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-1" Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.372652 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56b9755d7c-lfsh8" podStartSLOduration=3.337342744 podStartE2EDuration="21.372632388s" podCreationTimestamp="2026-03-12 14:01:35 +0000 UTC" firstStartedPulling="2026-03-12 14:01:36.90704249 +0000 UTC m=+3119.597114461" lastFinishedPulling="2026-03-12 14:01:54.942332134 +0000 UTC m=+3137.632404105" observedRunningTime="2026-03-12 14:01:56.364495267 +0000 UTC m=+3139.054567238" watchObservedRunningTime="2026-03-12 14:01:56.372632388 +0000 UTC 
m=+3139.062704359" Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.406626 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-bbd56cc76-cwl96" podStartSLOduration=3.2651447239999998 podStartE2EDuration="18.406610485s" podCreationTimestamp="2026-03-12 14:01:38 +0000 UTC" firstStartedPulling="2026-03-12 14:01:39.907252132 +0000 UTC m=+3122.597324103" lastFinishedPulling="2026-03-12 14:01:55.048717883 +0000 UTC m=+3137.738789864" observedRunningTime="2026-03-12 14:01:56.405562683 +0000 UTC m=+3139.095634654" watchObservedRunningTime="2026-03-12 14:01:56.406610485 +0000 UTC m=+3139.096682456" Mar 12 14:01:56 crc kubenswrapper[4921]: I0312 14:01:56.517176 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 12 14:01:57 crc kubenswrapper[4921]: I0312 14:01:57.366065 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"47b82052-6f75-4fe5-b4af-9726f2a59c2f","Type":"ContainerStarted","Data":"74c4dfb93271e27496dabc06c90b708c8a16b16b5fc2ed59988de080308c3235"} Mar 12 14:01:57 crc kubenswrapper[4921]: I0312 14:01:57.366729 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"47b82052-6f75-4fe5-b4af-9726f2a59c2f","Type":"ContainerStarted","Data":"425aec8e56ac8fa549183da3b09d723a5bd783922c810d8396f92198d61c13bb"} Mar 12 14:01:57 crc kubenswrapper[4921]: I0312 14:01:57.366744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"47b82052-6f75-4fe5-b4af-9726f2a59c2f","Type":"ContainerStarted","Data":"b0c225879d37a59cae7b581da9de72cb061837f9489e90490a465ec062a0ac20"} Mar 12 14:01:57 crc kubenswrapper[4921]: I0312 14:01:57.861028 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:57 crc kubenswrapper[4921]: I0312 14:01:57.861139 4921 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.050767 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-1" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.108740 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.109052 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-log" containerID="cri-o://2b1b6d94e766a0a67f6026f13b8924ff205c3b1e83923b820ef84adac2495236" gracePeriod=30 Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.109123 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-httpd" containerID="cri-o://06ac18509ae2c75136b7365cb7941a477ff9375b3cadf673e69181b8b2c9cda0" gracePeriod=30 Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.306353 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.381656 4921 generic.go:334] "Generic (PLEG): container finished" podID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerID="42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de" exitCode=0 Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.381716 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a","Type":"ContainerDied","Data":"42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de"} Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.381742 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a","Type":"ContainerDied","Data":"1a75bef7719a50942d89c8cadf2f77fb37e0f78b2faa53448550f73f3b1d7997"} Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.381758 4921 scope.go:117] "RemoveContainer" containerID="42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.381889 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.390486 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerID="2b1b6d94e766a0a67f6026f13b8924ff205c3b1e83923b820ef84adac2495236" exitCode=143 Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.390962 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5d74c96-9dfe-4db7-8287-b68a27840cf8","Type":"ContainerDied","Data":"2b1b6d94e766a0a67f6026f13b8924ff205c3b1e83923b820ef84adac2495236"} Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.392276 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.403896 4921 scope.go:117] "RemoveContainer" containerID="11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.405105 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-ceph\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.405213 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-scripts\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.405239 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-config-data\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " 
Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.405305 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nmm4\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-kube-api-access-4nmm4\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.405377 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-combined-ca-bundle\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.405421 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-httpd-run\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.410413 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.411183 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-logs\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.411229 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.411254 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-public-tls-certs\") pod \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\" (UID: \"32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a\") " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.412727 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-logs" (OuterVolumeSpecName: "logs") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.413481 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-ceph" (OuterVolumeSpecName: "ceph") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.413521 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-kube-api-access-4nmm4" (OuterVolumeSpecName: "kube-api-access-4nmm4") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "kube-api-access-4nmm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.413873 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.413923 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.413936 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nmm4\" (UniqueName: \"kubernetes.io/projected/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-kube-api-access-4nmm4\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.413949 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.415263 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-scripts" (OuterVolumeSpecName: "scripts") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.419765 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.443539 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.443520176 podStartE2EDuration="3.443520176s" podCreationTimestamp="2026-03-12 14:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:01:58.407252668 +0000 UTC m=+3141.097324639" watchObservedRunningTime="2026-03-12 14:01:58.443520176 +0000 UTC m=+3141.133592147" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.467792 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.511618 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.513793 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-config-data" (OuterVolumeSpecName: "config-data") pod "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" (UID: "32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.516294 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.516355 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.516385 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.516395 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.516404 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.518469 4921 scope.go:117] "RemoveContainer" containerID="42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de" Mar 12 14:01:58 crc 
kubenswrapper[4921]: E0312 14:01:58.518929 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de\": container with ID starting with 42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de not found: ID does not exist" containerID="42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.518956 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de"} err="failed to get container status \"42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de\": rpc error: code = NotFound desc = could not find container \"42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de\": container with ID starting with 42d267bdcd1547ac622efcdb43e50ac9f9e7c1dfdf5aef5a880c3d9372a108de not found: ID does not exist" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.518975 4921 scope.go:117] "RemoveContainer" containerID="11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445" Mar 12 14:01:58 crc kubenswrapper[4921]: E0312 14:01:58.519409 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445\": container with ID starting with 11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445 not found: ID does not exist" containerID="11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.519435 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445"} err="failed to get container status 
\"11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445\": rpc error: code = NotFound desc = could not find container \"11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445\": container with ID starting with 11f154ddfbccbb99b3d12cddc0220c29418164b631dddab7e94fd8f695410445 not found: ID does not exist" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.537364 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.618361 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.714444 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.725007 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.752422 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:58 crc kubenswrapper[4921]: E0312 14:01:58.752801 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-httpd" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.752833 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-httpd" Mar 12 14:01:58 crc kubenswrapper[4921]: E0312 14:01:58.752845 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-log" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.752852 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-log" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.753024 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-log" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.753034 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" containerName="glance-httpd" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.753994 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.765256 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823333 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-logs\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823412 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823463 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " 
pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-ceph\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823583 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823622 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823767 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823892 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " 
pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.823939 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6w4g\" (UniqueName: \"kubernetes.io/projected/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-kube-api-access-c6w4g\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-logs\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926349 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926400 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-ceph\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc 
kubenswrapper[4921]: I0312 14:01:58.926434 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926455 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926491 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926520 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.926543 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6w4g\" (UniqueName: \"kubernetes.io/projected/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-kube-api-access-c6w4g\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.927138 4921 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.927225 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.927356 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.927403 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.936629 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-logs\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.938457 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.938949 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.939422 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-ceph\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.939697 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.940226 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.964410 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6w4g\" (UniqueName: \"kubernetes.io/projected/3ddcb284-70a7-47da-8b0e-e5ba1f0a9443-kube-api-access-c6w4g\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.965956 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.966376 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:01:58 crc kubenswrapper[4921]: I0312 14:01:58.978296 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443\") " pod="openstack/glance-default-external-api-0" Mar 12 14:01:59 crc kubenswrapper[4921]: I0312 14:01:59.078077 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 12 14:01:59 crc kubenswrapper[4921]: I0312 14:01:59.769463 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 12 14:01:59 crc kubenswrapper[4921]: W0312 14:01:59.782648 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ddcb284_70a7_47da_8b0e_e5ba1f0a9443.slice/crio-17815a14a8957b036abe634bdb044a201bd0779f75ac062c4d3f268851eca3b4 WatchSource:0}: Error finding container 17815a14a8957b036abe634bdb044a201bd0779f75ac062c4d3f268851eca3b4: Status 404 returned error can't find the container with id 17815a14a8957b036abe634bdb044a201bd0779f75ac062c4d3f268851eca3b4 Mar 12 14:01:59 crc kubenswrapper[4921]: I0312 14:01:59.996321 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a" path="/var/lib/kubelet/pods/32d2a6a7-7f9b-4a8e-b2e1-25e748359a5a/volumes" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.175474 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555402-jsfzg"] Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.177292 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.179735 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.179919 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.180103 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.183332 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555402-jsfzg"] Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.254745 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9snm8\" (UniqueName: \"kubernetes.io/projected/626b7901-75fd-4a39-bef1-2fc34d374f41-kube-api-access-9snm8\") pod \"auto-csr-approver-29555402-jsfzg\" (UID: \"626b7901-75fd-4a39-bef1-2fc34d374f41\") " pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.356938 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9snm8\" (UniqueName: \"kubernetes.io/projected/626b7901-75fd-4a39-bef1-2fc34d374f41-kube-api-access-9snm8\") pod \"auto-csr-approver-29555402-jsfzg\" (UID: \"626b7901-75fd-4a39-bef1-2fc34d374f41\") " pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.386443 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9snm8\" (UniqueName: \"kubernetes.io/projected/626b7901-75fd-4a39-bef1-2fc34d374f41-kube-api-access-9snm8\") pod \"auto-csr-approver-29555402-jsfzg\" (UID: \"626b7901-75fd-4a39-bef1-2fc34d374f41\") " 
pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.411585 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443","Type":"ContainerStarted","Data":"65664e905116c01e250d07dd2f23782b50cba58a4565f1177bcce3f72394d592"} Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.411826 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443","Type":"ContainerStarted","Data":"17815a14a8957b036abe634bdb044a201bd0779f75ac062c4d3f268851eca3b4"} Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.507267 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:00 crc kubenswrapper[4921]: I0312 14:02:00.973548 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555402-jsfzg"] Mar 12 14:02:00 crc kubenswrapper[4921]: W0312 14:02:00.978640 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626b7901_75fd_4a39_bef1_2fc34d374f41.slice/crio-e1f63fdf590e3306561e600fd055ca26d610e4d4fdf6fcc471c749711180b19b WatchSource:0}: Error finding container e1f63fdf590e3306561e600fd055ca26d610e4d4fdf6fcc471c749711180b19b: Status 404 returned error can't find the container with id e1f63fdf590e3306561e600fd055ca26d610e4d4fdf6fcc471c749711180b19b Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.422311 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ddcb284-70a7-47da-8b0e-e5ba1f0a9443","Type":"ContainerStarted","Data":"68517f660c850138daebf3ac9dd57dd629d47d3be61efa3be2752abd5f8981ef"} Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.423661 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" event={"ID":"626b7901-75fd-4a39-bef1-2fc34d374f41","Type":"ContainerStarted","Data":"e1f63fdf590e3306561e600fd055ca26d610e4d4fdf6fcc471c749711180b19b"} Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.434196 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerID="06ac18509ae2c75136b7365cb7941a477ff9375b3cadf673e69181b8b2c9cda0" exitCode=0 Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.434228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5d74c96-9dfe-4db7-8287-b68a27840cf8","Type":"ContainerDied","Data":"06ac18509ae2c75136b7365cb7941a477ff9375b3cadf673e69181b8b2c9cda0"} Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.447373 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.447354039 podStartE2EDuration="3.447354039s" podCreationTimestamp="2026-03-12 14:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:02:01.440267901 +0000 UTC m=+3144.130339872" watchObservedRunningTime="2026-03-12 14:02:01.447354039 +0000 UTC m=+3144.137426010" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.784227 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.882855 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-httpd-run\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.882964 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cljv8\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-kube-api-access-cljv8\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884249 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-combined-ca-bundle\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.883371 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884380 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-scripts\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884401 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-config-data\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884464 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884488 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-internal-tls-certs\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884533 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-logs\") pod \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.884627 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-ceph\") pod 
\"d5d74c96-9dfe-4db7-8287-b68a27840cf8\" (UID: \"d5d74c96-9dfe-4db7-8287-b68a27840cf8\") " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.885362 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.886053 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-logs" (OuterVolumeSpecName: "logs") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.891976 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-scripts" (OuterVolumeSpecName: "scripts") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.892058 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-ceph" (OuterVolumeSpecName: "ceph") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.892050 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-kube-api-access-cljv8" (OuterVolumeSpecName: "kube-api-access-cljv8") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "kube-api-access-cljv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.897310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "glance") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.913965 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.938019 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.953630 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-config-data" (OuterVolumeSpecName: "config-data") pod "d5d74c96-9dfe-4db7-8287-b68a27840cf8" (UID: "d5d74c96-9dfe-4db7-8287-b68a27840cf8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987259 4921 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-ceph\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987294 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cljv8\" (UniqueName: \"kubernetes.io/projected/d5d74c96-9dfe-4db7-8287-b68a27840cf8-kube-api-access-cljv8\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987309 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987322 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987334 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987369 4921 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987381 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d74c96-9dfe-4db7-8287-b68a27840cf8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:01 crc kubenswrapper[4921]: I0312 14:02:01.987394 4921 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d74c96-9dfe-4db7-8287-b68a27840cf8-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.022062 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.088835 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.449384 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.449795 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d5d74c96-9dfe-4db7-8287-b68a27840cf8","Type":"ContainerDied","Data":"c3dc3b43a1d58be9bb151572cc8beb4943ca96d1db80efb2c460e3d4ada5ed5e"} Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.449958 4921 scope.go:117] "RemoveContainer" containerID="06ac18509ae2c75136b7365cb7941a477ff9375b3cadf673e69181b8b2c9cda0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.480018 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.491318 4921 scope.go:117] "RemoveContainer" containerID="2b1b6d94e766a0a67f6026f13b8924ff205c3b1e83923b820ef84adac2495236" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.500441 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.511322 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:02:02 crc 
kubenswrapper[4921]: E0312 14:02:02.511766 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-log" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.511778 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-log" Mar 12 14:02:02 crc kubenswrapper[4921]: E0312 14:02:02.511797 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-httpd" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.511803 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-httpd" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.512013 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-log" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.512032 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" containerName="glance-httpd" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.513119 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.524353 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.703975 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704106 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d506b9f9-1563-432f-9b21-760ceb017fe9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704147 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704195 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d506b9f9-1563-432f-9b21-760ceb017fe9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704253 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d506b9f9-1563-432f-9b21-760ceb017fe9-logs\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704501 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct2zb\" (UniqueName: \"kubernetes.io/projected/d506b9f9-1563-432f-9b21-760ceb017fe9-kube-api-access-ct2zb\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704687 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.704891 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807186 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ct2zb\" (UniqueName: \"kubernetes.io/projected/d506b9f9-1563-432f-9b21-760ceb017fe9-kube-api-access-ct2zb\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807251 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807313 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807361 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807388 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d506b9f9-1563-432f-9b21-760ceb017fe9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807414 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807443 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d506b9f9-1563-432f-9b21-760ceb017fe9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807477 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d506b9f9-1563-432f-9b21-760ceb017fe9-logs\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.807512 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.808284 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d506b9f9-1563-432f-9b21-760ceb017fe9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.808412 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"d506b9f9-1563-432f-9b21-760ceb017fe9\") device mount path \"/mnt/openstack/pv16\"" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.812101 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d506b9f9-1563-432f-9b21-760ceb017fe9-logs\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.814132 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.815907 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.817349 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d506b9f9-1563-432f-9b21-760ceb017fe9-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.817780 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 
14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.820121 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d506b9f9-1563-432f-9b21-760ceb017fe9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.823299 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct2zb\" (UniqueName: \"kubernetes.io/projected/d506b9f9-1563-432f-9b21-760ceb017fe9-kube-api-access-ct2zb\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.851702 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"glance-default-internal-api-0\" (UID: \"d506b9f9-1563-432f-9b21-760ceb017fe9\") " pod="openstack/glance-default-internal-api-0" Mar 12 14:02:02 crc kubenswrapper[4921]: I0312 14:02:02.982917 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:02:02 crc kubenswrapper[4921]: E0312 14:02:02.983208 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:02:03 crc kubenswrapper[4921]: I0312 14:02:03.143669 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:03 crc kubenswrapper[4921]: I0312 14:02:03.475790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" event={"ID":"626b7901-75fd-4a39-bef1-2fc34d374f41","Type":"ContainerStarted","Data":"39f0e93d2aac44476afa1cd4f5c38b431622b2118c53082b5e35045add8acf6c"} Mar 12 14:02:03 crc kubenswrapper[4921]: I0312 14:02:03.493645 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" podStartSLOduration=1.796492271 podStartE2EDuration="3.493625199s" podCreationTimestamp="2026-03-12 14:02:00 +0000 UTC" firstStartedPulling="2026-03-12 14:02:00.981021716 +0000 UTC m=+3143.671093687" lastFinishedPulling="2026-03-12 14:02:02.678154644 +0000 UTC m=+3145.368226615" observedRunningTime="2026-03-12 14:02:03.492456482 +0000 UTC m=+3146.182528453" watchObservedRunningTime="2026-03-12 14:02:03.493625199 +0000 UTC m=+3146.183697170" Mar 12 14:02:03 crc kubenswrapper[4921]: I0312 14:02:03.554419 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 12 14:02:03 crc kubenswrapper[4921]: I0312 14:02:03.994747 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d74c96-9dfe-4db7-8287-b68a27840cf8" path="/var/lib/kubelet/pods/d5d74c96-9dfe-4db7-8287-b68a27840cf8/volumes" Mar 12 14:02:04 crc kubenswrapper[4921]: I0312 14:02:04.488714 4921 generic.go:334] "Generic (PLEG): container finished" podID="626b7901-75fd-4a39-bef1-2fc34d374f41" containerID="39f0e93d2aac44476afa1cd4f5c38b431622b2118c53082b5e35045add8acf6c" exitCode=0 Mar 12 14:02:04 crc kubenswrapper[4921]: I0312 14:02:04.488925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" 
event={"ID":"626b7901-75fd-4a39-bef1-2fc34d374f41","Type":"ContainerDied","Data":"39f0e93d2aac44476afa1cd4f5c38b431622b2118c53082b5e35045add8acf6c"} Mar 12 14:02:04 crc kubenswrapper[4921]: I0312 14:02:04.491397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d506b9f9-1563-432f-9b21-760ceb017fe9","Type":"ContainerStarted","Data":"49c27591559b601e7b6191e5b9b07ea87241211634749bf77f6f2b290bd2d618"} Mar 12 14:02:04 crc kubenswrapper[4921]: I0312 14:02:04.491421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d506b9f9-1563-432f-9b21-760ceb017fe9","Type":"ContainerStarted","Data":"3c9954a519cd4eff4ebe22474ca8ca19dff0558dfbdef20acd0e15ca89fbf0ae"} Mar 12 14:02:05 crc kubenswrapper[4921]: I0312 14:02:05.366559 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-77dd7dfdbc-tsrfv" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.1.4:9696/\": dial tcp 10.217.1.4:9696: connect: connection refused" Mar 12 14:02:05 crc kubenswrapper[4921]: I0312 14:02:05.392343 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:02:05 crc kubenswrapper[4921]: I0312 14:02:05.510637 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d506b9f9-1563-432f-9b21-760ceb017fe9","Type":"ContainerStarted","Data":"c76e50e4204a41ae2f2ac3823c7fa35dcf09bbdef6ecd3ca93a013c481ea4c63"} Mar 12 14:02:05 crc kubenswrapper[4921]: I0312 14:02:05.548376 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.548355218 podStartE2EDuration="3.548355218s" podCreationTimestamp="2026-03-12 14:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:02:05.533205781 +0000 UTC m=+3148.223277752" watchObservedRunningTime="2026-03-12 14:02:05.548355218 +0000 UTC m=+3148.238427189" Mar 12 14:02:05 crc kubenswrapper[4921]: I0312 14:02:05.887917 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.093660 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.100002 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.220354 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.309959 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9snm8\" (UniqueName: \"kubernetes.io/projected/626b7901-75fd-4a39-bef1-2fc34d374f41-kube-api-access-9snm8\") pod \"626b7901-75fd-4a39-bef1-2fc34d374f41\" (UID: \"626b7901-75fd-4a39-bef1-2fc34d374f41\") " Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.315861 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626b7901-75fd-4a39-bef1-2fc34d374f41-kube-api-access-9snm8" (OuterVolumeSpecName: "kube-api-access-9snm8") pod "626b7901-75fd-4a39-bef1-2fc34d374f41" (UID: "626b7901-75fd-4a39-bef1-2fc34d374f41"). InnerVolumeSpecName "kube-api-access-9snm8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.412955 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9snm8\" (UniqueName: \"kubernetes.io/projected/626b7901-75fd-4a39-bef1-2fc34d374f41-kube-api-access-9snm8\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.530676 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" event={"ID":"626b7901-75fd-4a39-bef1-2fc34d374f41","Type":"ContainerDied","Data":"e1f63fdf590e3306561e600fd055ca26d610e4d4fdf6fcc471c749711180b19b"} Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.530735 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1f63fdf590e3306561e600fd055ca26d610e4d4fdf6fcc471c749711180b19b" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.530702 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555402-jsfzg" Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.586737 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555396-xbq7x"] Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.594472 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555396-xbq7x"] Mar 12 14:02:06 crc kubenswrapper[4921]: I0312 14:02:06.825294 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c8b44c5c7-l6d8m" Mar 12 14:02:07 crc kubenswrapper[4921]: I0312 14:02:07.994601 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8e0d28-8ca4-4de9-aaf1-27d835622e57" path="/var/lib/kubelet/pods/1e8e0d28-8ca4-4de9-aaf1-27d835622e57/volumes" Mar 12 14:02:08 crc kubenswrapper[4921]: I0312 14:02:08.929186 4921 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-6f698cc456-lcngv" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.15:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.15:8443: connect: connection refused" Mar 12 14:02:08 crc kubenswrapper[4921]: I0312 14:02:08.968376 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-bbd56cc76-cwl96" podUID="e6e62dec-8193-4d3c-a111-2ee250f79b86" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.16:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.16:8443: connect: connection refused" Mar 12 14:02:09 crc kubenswrapper[4921]: I0312 14:02:09.078978 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 14:02:09 crc kubenswrapper[4921]: I0312 14:02:09.079044 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 12 14:02:09 crc kubenswrapper[4921]: I0312 14:02:09.116555 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 14:02:09 crc kubenswrapper[4921]: I0312 14:02:09.150743 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 12 14:02:09 crc kubenswrapper[4921]: I0312 14:02:09.561063 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 14:02:09 crc kubenswrapper[4921]: I0312 14:02:09.561405 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 12 14:02:11 crc kubenswrapper[4921]: I0312 14:02:11.976925 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 14:02:11 crc kubenswrapper[4921]: I0312 14:02:11.977422 4921 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:02:12 crc kubenswrapper[4921]: I0312 14:02:12.015138 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.144281 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.145914 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.190395 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.193372 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.600535 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77dd7dfdbc-tsrfv_7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5/neutron-api/0.log" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.600970 4921 generic.go:334] "Generic (PLEG): container finished" podID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerID="15dcb7800a5ace68fcd8ca6e2d30df6e7f87ab1a9fafbe0d36b28f5bfd5746c9" exitCode=137 Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.601433 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-tsrfv" event={"ID":"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5","Type":"ContainerDied","Data":"15dcb7800a5ace68fcd8ca6e2d30df6e7f87ab1a9fafbe0d36b28f5bfd5746c9"} Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.602171 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: 
I0312 14:02:13.602840 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.856731 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-547b7895d7-9c58r" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.911607 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77dd7dfdbc-bp67m"] Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.915395 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77dd7dfdbc-bp67m" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-api" containerID="cri-o://a6f5bd804e97a767d1a05fb61556f05b13af54eb76fa4abc65dc0c39906023dc" gracePeriod=30 Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.915440 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77dd7dfdbc-bp67m" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-httpd" containerID="cri-o://65c14d7e8788a0f9a68db25707046d3b0017bcec70c84e179a1930b781c69dc4" gracePeriod=30 Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.986518 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-547b7895d7-42nbh"] Mar 12 14:02:13 crc kubenswrapper[4921]: E0312 14:02:13.995509 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="626b7901-75fd-4a39-bef1-2fc34d374f41" containerName="oc" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.995608 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="626b7901-75fd-4a39-bef1-2fc34d374f41" containerName="oc" Mar 12 14:02:13 crc kubenswrapper[4921]: I0312 14:02:13.995857 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="626b7901-75fd-4a39-bef1-2fc34d374f41" containerName="oc" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.022203 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-httpd-config\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.022561 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-public-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.022743 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzv7r\" (UniqueName: \"kubernetes.io/projected/f1a475b3-67ed-40db-b403-0f82930d5d36-kube-api-access-lzv7r\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.022968 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-internal-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.023273 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-combined-ca-bundle\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.023458 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-config\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.023619 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-ovndb-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.028234 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.064803 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547b7895d7-42nbh"] Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.065962 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:02:14 crc kubenswrapper[4921]: E0312 14:02:14.066318 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131591 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-httpd-config\") pod 
\"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131661 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-public-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131701 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzv7r\" (UniqueName: \"kubernetes.io/projected/f1a475b3-67ed-40db-b403-0f82930d5d36-kube-api-access-lzv7r\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131752 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-internal-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131776 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-combined-ca-bundle\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131837 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-config\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " 
pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.131884 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-ovndb-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.141235 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-config\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.141573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-internal-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.142330 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-combined-ca-bundle\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.148922 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-httpd-config\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.149448 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-ovndb-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.150790 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzv7r\" (UniqueName: \"kubernetes.io/projected/f1a475b3-67ed-40db-b403-0f82930d5d36-kube-api-access-lzv7r\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.150867 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1a475b3-67ed-40db-b403-0f82930d5d36-public-tls-certs\") pod \"neutron-547b7895d7-42nbh\" (UID: \"f1a475b3-67ed-40db-b403-0f82930d5d36\") " pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.215431 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.319168 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77dd7dfdbc-tsrfv_7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5/neutron-api/0.log" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.319301 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336332 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-combined-ca-bundle\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336450 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9l84\" (UniqueName: \"kubernetes.io/projected/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-kube-api-access-d9l84\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336487 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-httpd-config\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336543 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-internal-tls-certs\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336583 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-public-tls-certs\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336611 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-config\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.336627 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-ovndb-tls-certs\") pod \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\" (UID: \"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5\") " Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.351108 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-kube-api-access-d9l84" (OuterVolumeSpecName: "kube-api-access-d9l84") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "kube-api-access-d9l84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.363275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.444572 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9l84\" (UniqueName: \"kubernetes.io/projected/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-kube-api-access-d9l84\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.444734 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.445773 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.446310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.464256 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.493654 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-config" (OuterVolumeSpecName: "config") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.516251 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" (UID: "7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.547638 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.547675 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.547684 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.547695 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc 
kubenswrapper[4921]: I0312 14:02:14.547705 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.611122 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77dd7dfdbc-tsrfv_7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5/neutron-api/0.log" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.611191 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-tsrfv" event={"ID":"7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5","Type":"ContainerDied","Data":"8a7c795bd4de4e959fb37f8009f3bcbd6ea6786f1d92d08c401a219f8c5f0efb"} Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.611228 4921 scope.go:117] "RemoveContainer" containerID="45d1d4c39d265b4155f1a4db8086bc519b8140af18883faf3df4cf5a7b54227e" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.611341 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77dd7dfdbc-tsrfv" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.620544 4921 generic.go:334] "Generic (PLEG): container finished" podID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerID="65c14d7e8788a0f9a68db25707046d3b0017bcec70c84e179a1930b781c69dc4" exitCode=0 Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.620725 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-bp67m" event={"ID":"426d27b4-1f08-4c20-84c9-67b47fbc4753","Type":"ContainerDied","Data":"65c14d7e8788a0f9a68db25707046d3b0017bcec70c84e179a1930b781c69dc4"} Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.718491 4921 scope.go:117] "RemoveContainer" containerID="15dcb7800a5ace68fcd8ca6e2d30df6e7f87ab1a9fafbe0d36b28f5bfd5746c9" Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.720861 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77dd7dfdbc-tsrfv"] Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.731166 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77dd7dfdbc-tsrfv"] Mar 12 14:02:14 crc kubenswrapper[4921]: I0312 14:02:14.873508 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-547b7895d7-42nbh"] Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.629506 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547b7895d7-42nbh" event={"ID":"f1a475b3-67ed-40db-b403-0f82930d5d36","Type":"ContainerStarted","Data":"4225b8827f713ba242474dcc5e77043739a126374fa34cf283a4cb30ade1f495"} Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.630207 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-547b7895d7-42nbh" event={"ID":"f1a475b3-67ed-40db-b403-0f82930d5d36","Type":"ContainerStarted","Data":"ac23c6d3923ee217c29d70c2e05db4202016e8bae90316b2a3f87b953faa2ede"} Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.630295 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/neutron-547b7895d7-42nbh" event={"ID":"f1a475b3-67ed-40db-b403-0f82930d5d36","Type":"ContainerStarted","Data":"d2e80e8f65bfc4d64c33aec47e0bbb585af9489aa72c80c59da7b9cefc3c8ea7"} Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.630312 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.630982 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.631004 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:02:15 crc kubenswrapper[4921]: I0312 14:02:15.993914 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" path="/var/lib/kubelet/pods/7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5/volumes" Mar 12 14:02:16 crc kubenswrapper[4921]: I0312 14:02:16.605452 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 12 14:02:16 crc kubenswrapper[4921]: I0312 14:02:16.642218 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-547b7895d7-42nbh" podStartSLOduration=3.642200396 podStartE2EDuration="3.642200396s" podCreationTimestamp="2026-03-12 14:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:02:15.664490232 +0000 UTC m=+3158.354562193" watchObservedRunningTime="2026-03-12 14:02:16.642200396 +0000 UTC m=+3159.332272367" Mar 12 14:02:16 crc kubenswrapper[4921]: I0312 14:02:16.645189 4921 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 14:02:16 crc kubenswrapper[4921]: I0312 14:02:16.819820 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 12 14:02:20 crc kubenswrapper[4921]: I0312 14:02:20.751372 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:02:20 crc kubenswrapper[4921]: I0312 14:02:20.771270 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:02:22 crc kubenswrapper[4921]: I0312 14:02:22.593270 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:02:22 crc kubenswrapper[4921]: I0312 14:02:22.627209 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-bbd56cc76-cwl96" Mar 12 14:02:22 crc kubenswrapper[4921]: I0312 14:02:22.783130 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f698cc456-lcngv"] Mar 12 14:02:22 crc kubenswrapper[4921]: I0312 14:02:22.783341 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f698cc456-lcngv" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon-log" containerID="cri-o://8f2e2f6850e5ed504a8792d3daf8c4b2aa409ba15c8634aa1d5079c30c21ccba" gracePeriod=30 Mar 12 14:02:22 crc kubenswrapper[4921]: I0312 14:02:22.783511 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6f698cc456-lcngv" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" containerID="cri-o://9c34569e4ff3e5a70ccd9fa4202d72a797cac94e1f445c82a59664436b20418c" gracePeriod=30 Mar 12 14:02:24 crc kubenswrapper[4921]: I0312 14:02:24.984311 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:02:24 crc kubenswrapper[4921]: E0312 14:02:24.985318 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.767269 4921 generic.go:334] "Generic (PLEG): container finished" podID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerID="9c34569e4ff3e5a70ccd9fa4202d72a797cac94e1f445c82a59664436b20418c" exitCode=0 Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.767335 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f698cc456-lcngv" event={"ID":"e88e6256-b5e0-44bc-8f61-31e31844f957","Type":"ContainerDied","Data":"9c34569e4ff3e5a70ccd9fa4202d72a797cac94e1f445c82a59664436b20418c"} Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.773659 4921 generic.go:334] "Generic (PLEG): container finished" podID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerID="92eb4c6b9c5004a837e83effdc00b01cbf738416d64a6afb99e84d38fecc7584" exitCode=137 Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.773682 4921 generic.go:334] "Generic (PLEG): container finished" podID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerID="77e0acb25e568d205e7e59e9d8be85b343ee0622b60ebfc9eea244d3ed4c049e" exitCode=137 Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.773726 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dc95b94d7-gz566" event={"ID":"f081f129-7b40-467c-98cc-420f18d1d3ca","Type":"ContainerDied","Data":"92eb4c6b9c5004a837e83effdc00b01cbf738416d64a6afb99e84d38fecc7584"} Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.773749 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dc95b94d7-gz566" 
event={"ID":"f081f129-7b40-467c-98cc-420f18d1d3ca","Type":"ContainerDied","Data":"77e0acb25e568d205e7e59e9d8be85b343ee0622b60ebfc9eea244d3ed4c049e"} Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.773759 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5dc95b94d7-gz566" event={"ID":"f081f129-7b40-467c-98cc-420f18d1d3ca","Type":"ContainerDied","Data":"8c88643a54b8332d972598d1ec1b39c6bdd0cbd6e46059c936b22bac501508bc"} Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.773768 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c88643a54b8332d972598d1ec1b39c6bdd0cbd6e46059c936b22bac501508bc" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.776420 4921 generic.go:334] "Generic (PLEG): container finished" podID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerID="cb52a62d9c5b19797e3ceb98c52215f057a49903ece973753480ee845479bc62" exitCode=137 Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.776434 4921 generic.go:334] "Generic (PLEG): container finished" podID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerID="e32b8fdd925530f81c05fc575e6e57e489f3e5a757c91e749b1d60ea6afd502a" exitCode=137 Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.776446 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56b9755d7c-lfsh8" event={"ID":"0e083a0f-e15a-4541-ac5b-2870ce8a245c","Type":"ContainerDied","Data":"cb52a62d9c5b19797e3ceb98c52215f057a49903ece973753480ee845479bc62"} Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.776462 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56b9755d7c-lfsh8" event={"ID":"0e083a0f-e15a-4541-ac5b-2870ce8a245c","Type":"ContainerDied","Data":"e32b8fdd925530f81c05fc575e6e57e489f3e5a757c91e749b1d60ea6afd502a"} Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.866168 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.870791 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.872155 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f081f129-7b40-467c-98cc-420f18d1d3ca-horizon-secret-key\") pod \"f081f129-7b40-467c-98cc-420f18d1d3ca\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.872202 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-config-data\") pod \"f081f129-7b40-467c-98cc-420f18d1d3ca\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.872294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f081f129-7b40-467c-98cc-420f18d1d3ca-logs\") pod \"f081f129-7b40-467c-98cc-420f18d1d3ca\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.873133 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f081f129-7b40-467c-98cc-420f18d1d3ca-logs" (OuterVolumeSpecName: "logs") pod "f081f129-7b40-467c-98cc-420f18d1d3ca" (UID: "f081f129-7b40-467c-98cc-420f18d1d3ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.873763 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nxqr\" (UniqueName: \"kubernetes.io/projected/f081f129-7b40-467c-98cc-420f18d1d3ca-kube-api-access-5nxqr\") pod \"f081f129-7b40-467c-98cc-420f18d1d3ca\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.874319 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-scripts\") pod \"f081f129-7b40-467c-98cc-420f18d1d3ca\" (UID: \"f081f129-7b40-467c-98cc-420f18d1d3ca\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.874903 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f081f129-7b40-467c-98cc-420f18d1d3ca-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.881881 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f081f129-7b40-467c-98cc-420f18d1d3ca-kube-api-access-5nxqr" (OuterVolumeSpecName: "kube-api-access-5nxqr") pod "f081f129-7b40-467c-98cc-420f18d1d3ca" (UID: "f081f129-7b40-467c-98cc-420f18d1d3ca"). InnerVolumeSpecName "kube-api-access-5nxqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.882768 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f081f129-7b40-467c-98cc-420f18d1d3ca-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f081f129-7b40-467c-98cc-420f18d1d3ca" (UID: "f081f129-7b40-467c-98cc-420f18d1d3ca"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.913538 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-scripts" (OuterVolumeSpecName: "scripts") pod "f081f129-7b40-467c-98cc-420f18d1d3ca" (UID: "f081f129-7b40-467c-98cc-420f18d1d3ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.922243 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-config-data" (OuterVolumeSpecName: "config-data") pod "f081f129-7b40-467c-98cc-420f18d1d3ca" (UID: "f081f129-7b40-467c-98cc-420f18d1d3ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.976165 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-scripts\") pod \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.976347 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9vbr\" (UniqueName: \"kubernetes.io/projected/0e083a0f-e15a-4541-ac5b-2870ce8a245c-kube-api-access-z9vbr\") pod \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.976407 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e083a0f-e15a-4541-ac5b-2870ce8a245c-logs\") pod \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 
14:02:26.976595 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-config-data\") pod \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.976630 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e083a0f-e15a-4541-ac5b-2870ce8a245c-horizon-secret-key\") pod \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\" (UID: \"0e083a0f-e15a-4541-ac5b-2870ce8a245c\") " Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.977000 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nxqr\" (UniqueName: \"kubernetes.io/projected/f081f129-7b40-467c-98cc-420f18d1d3ca-kube-api-access-5nxqr\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.977022 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.977032 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f081f129-7b40-467c-98cc-420f18d1d3ca-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.977040 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f081f129-7b40-467c-98cc-420f18d1d3ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.977872 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e083a0f-e15a-4541-ac5b-2870ce8a245c-logs" (OuterVolumeSpecName: "logs") pod 
"0e083a0f-e15a-4541-ac5b-2870ce8a245c" (UID: "0e083a0f-e15a-4541-ac5b-2870ce8a245c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.980040 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e083a0f-e15a-4541-ac5b-2870ce8a245c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0e083a0f-e15a-4541-ac5b-2870ce8a245c" (UID: "0e083a0f-e15a-4541-ac5b-2870ce8a245c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:26 crc kubenswrapper[4921]: I0312 14:02:26.981210 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e083a0f-e15a-4541-ac5b-2870ce8a245c-kube-api-access-z9vbr" (OuterVolumeSpecName: "kube-api-access-z9vbr") pod "0e083a0f-e15a-4541-ac5b-2870ce8a245c" (UID: "0e083a0f-e15a-4541-ac5b-2870ce8a245c"). InnerVolumeSpecName "kube-api-access-z9vbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.001258 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-scripts" (OuterVolumeSpecName: "scripts") pod "0e083a0f-e15a-4541-ac5b-2870ce8a245c" (UID: "0e083a0f-e15a-4541-ac5b-2870ce8a245c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.011027 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-config-data" (OuterVolumeSpecName: "config-data") pod "0e083a0f-e15a-4541-ac5b-2870ce8a245c" (UID: "0e083a0f-e15a-4541-ac5b-2870ce8a245c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.079327 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.079364 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0e083a0f-e15a-4541-ac5b-2870ce8a245c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.079373 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e083a0f-e15a-4541-ac5b-2870ce8a245c-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.079385 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9vbr\" (UniqueName: \"kubernetes.io/projected/0e083a0f-e15a-4541-ac5b-2870ce8a245c-kube-api-access-z9vbr\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.079395 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e083a0f-e15a-4541-ac5b-2870ce8a245c-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.786771 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56b9755d7c-lfsh8" event={"ID":"0e083a0f-e15a-4541-ac5b-2870ce8a245c","Type":"ContainerDied","Data":"2b315090f86039051b321c497214e4f06a7ca0f598697d9a9782187cd8e8c3c8"} Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.787115 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-56b9755d7c-lfsh8" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.787302 4921 scope.go:117] "RemoveContainer" containerID="cb52a62d9c5b19797e3ceb98c52215f057a49903ece973753480ee845479bc62" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.791054 4921 generic.go:334] "Generic (PLEG): container finished" podID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerID="a6f5bd804e97a767d1a05fb61556f05b13af54eb76fa4abc65dc0c39906023dc" exitCode=0 Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.791138 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5dc95b94d7-gz566" Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.792946 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-bp67m" event={"ID":"426d27b4-1f08-4c20-84c9-67b47fbc4753","Type":"ContainerDied","Data":"a6f5bd804e97a767d1a05fb61556f05b13af54eb76fa4abc65dc0c39906023dc"} Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.837474 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5dc95b94d7-gz566"] Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.856001 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5dc95b94d7-gz566"] Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.863796 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-56b9755d7c-lfsh8"] Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.872678 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-56b9755d7c-lfsh8"] Mar 12 14:02:27 crc kubenswrapper[4921]: I0312 14:02:27.998373 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" path="/var/lib/kubelet/pods/0e083a0f-e15a-4541-ac5b-2870ce8a245c/volumes" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.000290 4921 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" path="/var/lib/kubelet/pods/f081f129-7b40-467c-98cc-420f18d1d3ca/volumes" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.001312 4921 scope.go:117] "RemoveContainer" containerID="e32b8fdd925530f81c05fc575e6e57e489f3e5a757c91e749b1d60ea6afd502a" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.207418 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.307736 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d9lp\" (UniqueName: \"kubernetes.io/projected/426d27b4-1f08-4c20-84c9-67b47fbc4753-kube-api-access-7d9lp\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.307848 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-config\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.308045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-combined-ca-bundle\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.308191 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-ovndb-tls-certs\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.308225 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-httpd-config\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.308290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-public-tls-certs\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.308322 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-internal-tls-certs\") pod \"426d27b4-1f08-4c20-84c9-67b47fbc4753\" (UID: \"426d27b4-1f08-4c20-84c9-67b47fbc4753\") " Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.314713 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.318279 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426d27b4-1f08-4c20-84c9-67b47fbc4753-kube-api-access-7d9lp" (OuterVolumeSpecName: "kube-api-access-7d9lp") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "kube-api-access-7d9lp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.375518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.376615 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.392056 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-config" (OuterVolumeSpecName: "config") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.394306 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.396283 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "426d27b4-1f08-4c20-84c9-67b47fbc4753" (UID: "426d27b4-1f08-4c20-84c9-67b47fbc4753"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410277 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410311 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410320 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410328 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410336 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410346 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d9lp\" 
(UniqueName: \"kubernetes.io/projected/426d27b4-1f08-4c20-84c9-67b47fbc4753-kube-api-access-7d9lp\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.410357 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/426d27b4-1f08-4c20-84c9-67b47fbc4753-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.804512 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77dd7dfdbc-bp67m" event={"ID":"426d27b4-1f08-4c20-84c9-67b47fbc4753","Type":"ContainerDied","Data":"4d5f602f5fbbc38d69a28156dde47e51200d00e7f0a8d0d4bcb619b9900a14c7"} Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.804891 4921 scope.go:117] "RemoveContainer" containerID="65c14d7e8788a0f9a68db25707046d3b0017bcec70c84e179a1930b781c69dc4" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.804603 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77dd7dfdbc-bp67m" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.842267 4921 scope.go:117] "RemoveContainer" containerID="a6f5bd804e97a767d1a05fb61556f05b13af54eb76fa4abc65dc0c39906023dc" Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.845878 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77dd7dfdbc-bp67m"] Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.857233 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77dd7dfdbc-bp67m"] Mar 12 14:02:28 crc kubenswrapper[4921]: I0312 14:02:28.928941 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f698cc456-lcngv" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.15:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.15:8443: connect: connection refused" Mar 12 14:02:29 crc kubenswrapper[4921]: I0312 
14:02:29.997555 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" path="/var/lib/kubelet/pods/426d27b4-1f08-4c20-84c9-67b47fbc4753/volumes" Mar 12 14:02:38 crc kubenswrapper[4921]: I0312 14:02:38.928615 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f698cc456-lcngv" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.15:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.15:8443: connect: connection refused" Mar 12 14:02:38 crc kubenswrapper[4921]: I0312 14:02:38.984350 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:02:38 crc kubenswrapper[4921]: E0312 14:02:38.985028 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:02:44 crc kubenswrapper[4921]: I0312 14:02:44.233555 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-547b7895d7-42nbh" Mar 12 14:02:44 crc kubenswrapper[4921]: I0312 14:02:44.327616 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55b9d64f77-lv45q"] Mar 12 14:02:44 crc kubenswrapper[4921]: I0312 14:02:44.327999 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55b9d64f77-lv45q" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-api" containerID="cri-o://94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403" gracePeriod=30 Mar 12 14:02:44 crc kubenswrapper[4921]: 
I0312 14:02:44.328559 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55b9d64f77-lv45q" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-httpd" containerID="cri-o://1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac" gracePeriod=30 Mar 12 14:02:44 crc kubenswrapper[4921]: I0312 14:02:44.964862 4921 generic.go:334] "Generic (PLEG): container finished" podID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerID="1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac" exitCode=0 Mar 12 14:02:44 crc kubenswrapper[4921]: I0312 14:02:44.964963 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b9d64f77-lv45q" event={"ID":"0cc858fd-b2e1-4626-9e77-215bd07e374f","Type":"ContainerDied","Data":"1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac"} Mar 12 14:02:44 crc kubenswrapper[4921]: I0312 14:02:44.986958 4921 scope.go:117] "RemoveContainer" containerID="195b1aa597b32127bb8951f6be388ab80e01b7d7ad43807926858a5a1bf81feb" Mar 12 14:02:48 crc kubenswrapper[4921]: I0312 14:02:48.928447 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6f698cc456-lcngv" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.15:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.15:8443: connect: connection refused" Mar 12 14:02:48 crc kubenswrapper[4921]: I0312 14:02:48.929111 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.563836 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.724938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-config\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.725258 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-internal-tls-certs\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.725377 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhx2w\" (UniqueName: \"kubernetes.io/projected/0cc858fd-b2e1-4626-9e77-215bd07e374f-kube-api-access-rhx2w\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.725415 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-combined-ca-bundle\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.725491 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-ovndb-tls-certs\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.725587 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-public-tls-certs\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.725661 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-httpd-config\") pod \"0cc858fd-b2e1-4626-9e77-215bd07e374f\" (UID: \"0cc858fd-b2e1-4626-9e77-215bd07e374f\") " Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.741333 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc858fd-b2e1-4626-9e77-215bd07e374f-kube-api-access-rhx2w" (OuterVolumeSpecName: "kube-api-access-rhx2w") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: "0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "kube-api-access-rhx2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.743699 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: "0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.796832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-config" (OuterVolumeSpecName: "config") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: "0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.805904 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: "0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.810398 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: "0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.816146 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: "0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.828540 4921 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.828573 4921 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.828583 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhx2w\" (UniqueName: \"kubernetes.io/projected/0cc858fd-b2e1-4626-9e77-215bd07e374f-kube-api-access-rhx2w\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.828593 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.828602 4921 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.828610 4921 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.851854 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0cc858fd-b2e1-4626-9e77-215bd07e374f" (UID: 
"0cc858fd-b2e1-4626-9e77-215bd07e374f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.930571 4921 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cc858fd-b2e1-4626-9e77-215bd07e374f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:50 crc kubenswrapper[4921]: I0312 14:02:50.983872 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:02:50 crc kubenswrapper[4921]: E0312 14:02:50.984213 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.043214 4921 generic.go:334] "Generic (PLEG): container finished" podID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerID="94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403" exitCode=0 Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.043257 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b9d64f77-lv45q" event={"ID":"0cc858fd-b2e1-4626-9e77-215bd07e374f","Type":"ContainerDied","Data":"94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403"} Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.043276 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b9d64f77-lv45q" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.043298 4921 scope.go:117] "RemoveContainer" containerID="1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.043286 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b9d64f77-lv45q" event={"ID":"0cc858fd-b2e1-4626-9e77-215bd07e374f","Type":"ContainerDied","Data":"0fe3edd5e0220a288f865544cc65f17f03a77377f9fa75bb5e2d75800c42cac5"} Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.110759 4921 scope.go:117] "RemoveContainer" containerID="94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.112304 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55b9d64f77-lv45q"] Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.123388 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55b9d64f77-lv45q"] Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.137230 4921 scope.go:117] "RemoveContainer" containerID="1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac" Mar 12 14:02:51 crc kubenswrapper[4921]: E0312 14:02:51.142293 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac\": container with ID starting with 1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac not found: ID does not exist" containerID="1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.142391 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac"} err="failed to get container status 
\"1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac\": rpc error: code = NotFound desc = could not find container \"1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac\": container with ID starting with 1f71e67876e862d1c7c3e10a287d57787c9a4de035f4f61e037024d4860147ac not found: ID does not exist" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.142459 4921 scope.go:117] "RemoveContainer" containerID="94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403" Mar 12 14:02:51 crc kubenswrapper[4921]: E0312 14:02:51.142938 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403\": container with ID starting with 94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403 not found: ID does not exist" containerID="94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.142981 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403"} err="failed to get container status \"94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403\": rpc error: code = NotFound desc = could not find container \"94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403\": container with ID starting with 94d44893e3c76ca70438854c97d11226b338e90efa37c39f8880b1921387d403 not found: ID does not exist" Mar 12 14:02:51 crc kubenswrapper[4921]: I0312 14:02:51.993005 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" path="/var/lib/kubelet/pods/0cc858fd-b2e1-4626-9e77-215bd07e374f/volumes" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.074329 4921 generic.go:334] "Generic (PLEG): container finished" podID="e88e6256-b5e0-44bc-8f61-31e31844f957" 
containerID="8f2e2f6850e5ed504a8792d3daf8c4b2aa409ba15c8634aa1d5079c30c21ccba" exitCode=137 Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.074412 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f698cc456-lcngv" event={"ID":"e88e6256-b5e0-44bc-8f61-31e31844f957","Type":"ContainerDied","Data":"8f2e2f6850e5ed504a8792d3daf8c4b2aa409ba15c8634aa1d5079c30c21ccba"} Mar 12 14:02:53 crc kubenswrapper[4921]: E0312 14:02:53.076666 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode88e6256_b5e0_44bc_8f61_31e31844f957.slice/crio-conmon-8f2e2f6850e5ed504a8792d3daf8c4b2aa409ba15c8634aa1d5079c30c21ccba.scope\": RecentStats: unable to find data in memory cache]" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.232579 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.403135 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-combined-ca-bundle\") pod \"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.403634 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28xzm\" (UniqueName: \"kubernetes.io/projected/e88e6256-b5e0-44bc-8f61-31e31844f957-kube-api-access-28xzm\") pod \"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.403883 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-scripts\") pod 
\"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.404350 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-config-data\") pod \"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.404496 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-tls-certs\") pod \"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.404605 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-secret-key\") pod \"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.404786 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e88e6256-b5e0-44bc-8f61-31e31844f957-logs\") pod \"e88e6256-b5e0-44bc-8f61-31e31844f957\" (UID: \"e88e6256-b5e0-44bc-8f61-31e31844f957\") " Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.405901 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e88e6256-b5e0-44bc-8f61-31e31844f957-logs" (OuterVolumeSpecName: "logs") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.419915 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e88e6256-b5e0-44bc-8f61-31e31844f957-kube-api-access-28xzm" (OuterVolumeSpecName: "kube-api-access-28xzm") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "kube-api-access-28xzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.420300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.452202 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.455275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-scripts" (OuterVolumeSpecName: "scripts") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.459478 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-config-data" (OuterVolumeSpecName: "config-data") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.475167 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "e88e6256-b5e0-44bc-8f61-31e31844f957" (UID: "e88e6256-b5e0-44bc-8f61-31e31844f957"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508350 4921 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-scripts\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508386 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e88e6256-b5e0-44bc-8f61-31e31844f957-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508398 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508412 4921 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 
12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508422 4921 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e88e6256-b5e0-44bc-8f61-31e31844f957-logs\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508433 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e88e6256-b5e0-44bc-8f61-31e31844f957-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:53 crc kubenswrapper[4921]: I0312 14:02:53.508443 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28xzm\" (UniqueName: \"kubernetes.io/projected/e88e6256-b5e0-44bc-8f61-31e31844f957-kube-api-access-28xzm\") on node \"crc\" DevicePath \"\"" Mar 12 14:02:54 crc kubenswrapper[4921]: I0312 14:02:54.088004 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f698cc456-lcngv" event={"ID":"e88e6256-b5e0-44bc-8f61-31e31844f957","Type":"ContainerDied","Data":"8fd31093ee07048c158413e63507d2b07e091454438290e32c63bfeb9d0ef4e0"} Mar 12 14:02:54 crc kubenswrapper[4921]: I0312 14:02:54.088105 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6f698cc456-lcngv" Mar 12 14:02:54 crc kubenswrapper[4921]: I0312 14:02:54.088332 4921 scope.go:117] "RemoveContainer" containerID="9c34569e4ff3e5a70ccd9fa4202d72a797cac94e1f445c82a59664436b20418c" Mar 12 14:02:54 crc kubenswrapper[4921]: I0312 14:02:54.137882 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f698cc456-lcngv"] Mar 12 14:02:54 crc kubenswrapper[4921]: I0312 14:02:54.146559 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f698cc456-lcngv"] Mar 12 14:02:54 crc kubenswrapper[4921]: I0312 14:02:54.305985 4921 scope.go:117] "RemoveContainer" containerID="8f2e2f6850e5ed504a8792d3daf8c4b2aa409ba15c8634aa1d5079c30c21ccba" Mar 12 14:02:55 crc kubenswrapper[4921]: I0312 14:02:55.993219 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" path="/var/lib/kubelet/pods/e88e6256-b5e0-44bc-8f61-31e31844f957/volumes" Mar 12 14:03:04 crc kubenswrapper[4921]: I0312 14:03:04.983589 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:03:04 crc kubenswrapper[4921]: E0312 14:03:04.984547 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:03:18 crc kubenswrapper[4921]: I0312 14:03:18.983106 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:03:18 crc kubenswrapper[4921]: E0312 14:03:18.985227 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:03:32 crc kubenswrapper[4921]: I0312 14:03:32.983667 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:03:32 crc kubenswrapper[4921]: E0312 14:03:32.984694 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:03:43 crc kubenswrapper[4921]: I0312 14:03:43.989544 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:03:43 crc kubenswrapper[4921]: E0312 14:03:43.990292 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.139887 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140699 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140718 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140730 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140738 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140767 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140776 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140787 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140794 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140807 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140836 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140855 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" 
containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140864 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140875 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140884 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140903 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140911 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140923 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140931 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140941 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.140949 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140969 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-httpd" Mar 12 14:03:50 crc 
kubenswrapper[4921]: I0312 14:03:50.140977 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: E0312 14:03:50.140991 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141000 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141264 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141281 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141294 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141304 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="426d27b4-1f08-4c20-84c9-67b47fbc4753" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141313 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141326 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d13f2b0-d36e-4e20-a7aa-0fce9a1728f5" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141342 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e88e6256-b5e0-44bc-8f61-31e31844f957" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 
14:03:50.141357 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141369 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-api" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141378 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc858fd-b2e1-4626-9e77-215bd07e374f" containerName="neutron-httpd" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141391 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e083a0f-e15a-4541-ac5b-2870ce8a245c" containerName="horizon-log" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.141407 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f081f129-7b40-467c-98cc-420f18d1d3ca" containerName="horizon" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.142104 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.145306 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.145312 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r9plt" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.145313 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.150106 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.168719 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.246965 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247023 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nqqk\" (UniqueName: \"kubernetes.io/projected/b061c47e-9c37-48ed-a879-9263d780de9f-kube-api-access-9nqqk\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247162 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ssh-key\") pod \"tempest-tests-tempest\" 
(UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247235 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-config-data\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247267 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247316 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247401 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247496 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.247556 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350064 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350108 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350137 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350254 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"tempest-tests-tempest\" (UID: 
\"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350279 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nqqk\" (UniqueName: \"kubernetes.io/projected/b061c47e-9c37-48ed-a879-9263d780de9f-kube-api-access-9nqqk\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350372 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-config-data\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350457 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350492 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " 
pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.350959 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.351178 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") device mount path \"/mnt/openstack/pv17\"" pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.351335 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.351643 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.352270 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-config-data\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc 
kubenswrapper[4921]: I0312 14:03:50.356382 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.363718 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.366236 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.366964 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nqqk\" (UniqueName: \"kubernetes.io/projected/b061c47e-9c37-48ed-a879-9263d780de9f-kube-api-access-9nqqk\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.384177 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"tempest-tests-tempest\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.463465 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 14:03:50 crc kubenswrapper[4921]: I0312 14:03:50.966282 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 12 14:03:51 crc kubenswrapper[4921]: I0312 14:03:51.703419 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b061c47e-9c37-48ed-a879-9263d780de9f","Type":"ContainerStarted","Data":"470c994aec24362bae4a0fe564f434cb8f71bf79cf999e9d406b81c3e7b4ca7f"} Mar 12 14:03:57 crc kubenswrapper[4921]: I0312 14:03:57.991505 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:03:57 crc kubenswrapper[4921]: E0312 14:03:57.992718 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.155786 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555404-dnjkp"] Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.158398 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.160862 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.161056 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.161430 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.167035 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555404-dnjkp"] Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.288696 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8ltm\" (UniqueName: \"kubernetes.io/projected/60262de2-7339-45cc-8f4e-7c74ede21b00-kube-api-access-w8ltm\") pod \"auto-csr-approver-29555404-dnjkp\" (UID: \"60262de2-7339-45cc-8f4e-7c74ede21b00\") " pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.391105 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8ltm\" (UniqueName: \"kubernetes.io/projected/60262de2-7339-45cc-8f4e-7c74ede21b00-kube-api-access-w8ltm\") pod \"auto-csr-approver-29555404-dnjkp\" (UID: \"60262de2-7339-45cc-8f4e-7c74ede21b00\") " pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.412928 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8ltm\" (UniqueName: \"kubernetes.io/projected/60262de2-7339-45cc-8f4e-7c74ede21b00-kube-api-access-w8ltm\") pod \"auto-csr-approver-29555404-dnjkp\" (UID: \"60262de2-7339-45cc-8f4e-7c74ede21b00\") " 
pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:00 crc kubenswrapper[4921]: I0312 14:04:00.486807 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:01 crc kubenswrapper[4921]: I0312 14:04:01.043091 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555404-dnjkp"] Mar 12 14:04:01 crc kubenswrapper[4921]: I0312 14:04:01.810775 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" event={"ID":"60262de2-7339-45cc-8f4e-7c74ede21b00","Type":"ContainerStarted","Data":"7111441b789e9bba92e24e9a299a8bd59288346bd4331f5b2ac7725f64a4b021"} Mar 12 14:04:08 crc kubenswrapper[4921]: I0312 14:04:08.984142 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:04:08 crc kubenswrapper[4921]: E0312 14:04:08.985030 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:04:17 crc kubenswrapper[4921]: E0312 14:04:17.545406 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 12 14:04:17 crc kubenswrapper[4921]: E0312 14:04:17.545883 4921 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nqqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b061c47e-9c37-48ed-a879-9263d780de9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 14:04:17 crc kubenswrapper[4921]: E0312 14:04:17.547120 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b061c47e-9c37-48ed-a879-9263d780de9f" Mar 12 14:04:17 crc kubenswrapper[4921]: I0312 14:04:17.945228 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" event={"ID":"60262de2-7339-45cc-8f4e-7c74ede21b00","Type":"ContainerStarted","Data":"84e54ed51993e750155b176d5258bf04926bb8ea435dcd7673b5b0c8db4b1464"} Mar 12 14:04:17 crc kubenswrapper[4921]: E0312 14:04:17.947153 4921 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="b061c47e-9c37-48ed-a879-9263d780de9f" Mar 12 14:04:17 crc kubenswrapper[4921]: I0312 14:04:17.981457 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" podStartSLOduration=1.5905470130000001 podStartE2EDuration="17.981433294s" podCreationTimestamp="2026-03-12 14:04:00 +0000 UTC" firstStartedPulling="2026-03-12 14:04:01.054700737 +0000 UTC m=+3263.744772708" lastFinishedPulling="2026-03-12 14:04:17.445586978 +0000 UTC m=+3280.135658989" observedRunningTime="2026-03-12 14:04:17.978912506 +0000 UTC m=+3280.668984497" watchObservedRunningTime="2026-03-12 14:04:17.981433294 +0000 UTC m=+3280.671505285" Mar 12 14:04:18 crc kubenswrapper[4921]: I0312 14:04:18.955966 4921 generic.go:334] "Generic (PLEG): container finished" podID="60262de2-7339-45cc-8f4e-7c74ede21b00" containerID="84e54ed51993e750155b176d5258bf04926bb8ea435dcd7673b5b0c8db4b1464" exitCode=0 Mar 12 14:04:18 crc kubenswrapper[4921]: I0312 14:04:18.956077 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" event={"ID":"60262de2-7339-45cc-8f4e-7c74ede21b00","Type":"ContainerDied","Data":"84e54ed51993e750155b176d5258bf04926bb8ea435dcd7673b5b0c8db4b1464"} Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.392191 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.458036 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8ltm\" (UniqueName: \"kubernetes.io/projected/60262de2-7339-45cc-8f4e-7c74ede21b00-kube-api-access-w8ltm\") pod \"60262de2-7339-45cc-8f4e-7c74ede21b00\" (UID: \"60262de2-7339-45cc-8f4e-7c74ede21b00\") " Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.466300 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60262de2-7339-45cc-8f4e-7c74ede21b00-kube-api-access-w8ltm" (OuterVolumeSpecName: "kube-api-access-w8ltm") pod "60262de2-7339-45cc-8f4e-7c74ede21b00" (UID: "60262de2-7339-45cc-8f4e-7c74ede21b00"). InnerVolumeSpecName "kube-api-access-w8ltm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.560240 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8ltm\" (UniqueName: \"kubernetes.io/projected/60262de2-7339-45cc-8f4e-7c74ede21b00-kube-api-access-w8ltm\") on node \"crc\" DevicePath \"\"" Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.979088 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" event={"ID":"60262de2-7339-45cc-8f4e-7c74ede21b00","Type":"ContainerDied","Data":"7111441b789e9bba92e24e9a299a8bd59288346bd4331f5b2ac7725f64a4b021"} Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.979499 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7111441b789e9bba92e24e9a299a8bd59288346bd4331f5b2ac7725f64a4b021" Mar 12 14:04:20 crc kubenswrapper[4921]: I0312 14:04:20.979161 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555404-dnjkp" Mar 12 14:04:21 crc kubenswrapper[4921]: I0312 14:04:21.506926 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555398-58zl7"] Mar 12 14:04:21 crc kubenswrapper[4921]: I0312 14:04:21.515765 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555398-58zl7"] Mar 12 14:04:22 crc kubenswrapper[4921]: I0312 14:04:22.563057 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab05a89-9086-4de8-9e24-03f59f6e2a0b" path="/var/lib/kubelet/pods/2ab05a89-9086-4de8-9e24-03f59f6e2a0b/volumes" Mar 12 14:04:23 crc kubenswrapper[4921]: I0312 14:04:23.983889 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:04:23 crc kubenswrapper[4921]: E0312 14:04:23.984538 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:04:32 crc kubenswrapper[4921]: I0312 14:04:32.087509 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b061c47e-9c37-48ed-a879-9263d780de9f","Type":"ContainerStarted","Data":"7f87c3a680a9388c9ad8b04a2749f8310e51245fe52013e18a20e5f8f7775e41"} Mar 12 14:04:32 crc kubenswrapper[4921]: I0312 14:04:32.120028 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.648235118 podStartE2EDuration="43.120011266s" podCreationTimestamp="2026-03-12 14:03:49 +0000 UTC" firstStartedPulling="2026-03-12 
14:03:50.968547138 +0000 UTC m=+3253.658619109" lastFinishedPulling="2026-03-12 14:04:30.440323286 +0000 UTC m=+3293.130395257" observedRunningTime="2026-03-12 14:04:32.111859544 +0000 UTC m=+3294.801931565" watchObservedRunningTime="2026-03-12 14:04:32.120011266 +0000 UTC m=+3294.810083247" Mar 12 14:04:35 crc kubenswrapper[4921]: I0312 14:04:35.000057 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:04:36 crc kubenswrapper[4921]: I0312 14:04:36.142627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"dcea6ad9c4de2d2d2b80377b6008ddac70ede11b4baa44e9ad37e97a8c292848"} Mar 12 14:04:45 crc kubenswrapper[4921]: I0312 14:04:45.250065 4921 scope.go:117] "RemoveContainer" containerID="2e9af1b25a2313f2cd4c9eca01b93c8be5a194ebf02e0099de6ff4c716bc4c9b" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.148335 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555406-pfcj8"] Mar 12 14:06:00 crc kubenswrapper[4921]: E0312 14:06:00.149460 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60262de2-7339-45cc-8f4e-7c74ede21b00" containerName="oc" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.149477 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="60262de2-7339-45cc-8f4e-7c74ede21b00" containerName="oc" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.149709 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="60262de2-7339-45cc-8f4e-7c74ede21b00" containerName="oc" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.150572 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.153401 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.153639 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.153845 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.159205 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555406-pfcj8"] Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.223784 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhbft\" (UniqueName: \"kubernetes.io/projected/adfd5a52-50b8-4d44-a80e-856ccc5e5514-kube-api-access-bhbft\") pod \"auto-csr-approver-29555406-pfcj8\" (UID: \"adfd5a52-50b8-4d44-a80e-856ccc5e5514\") " pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.325532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhbft\" (UniqueName: \"kubernetes.io/projected/adfd5a52-50b8-4d44-a80e-856ccc5e5514-kube-api-access-bhbft\") pod \"auto-csr-approver-29555406-pfcj8\" (UID: \"adfd5a52-50b8-4d44-a80e-856ccc5e5514\") " pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.343010 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhbft\" (UniqueName: \"kubernetes.io/projected/adfd5a52-50b8-4d44-a80e-856ccc5e5514-kube-api-access-bhbft\") pod \"auto-csr-approver-29555406-pfcj8\" (UID: \"adfd5a52-50b8-4d44-a80e-856ccc5e5514\") " 
pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.469201 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.960581 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555406-pfcj8"] Mar 12 14:06:00 crc kubenswrapper[4921]: I0312 14:06:00.977470 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:06:01 crc kubenswrapper[4921]: I0312 14:06:01.963931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" event={"ID":"adfd5a52-50b8-4d44-a80e-856ccc5e5514","Type":"ContainerStarted","Data":"6aa638bfb782a02617ce03008b9adc32ece95d7263216be8f364d28dca16e4a3"} Mar 12 14:06:02 crc kubenswrapper[4921]: I0312 14:06:02.973664 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" event={"ID":"adfd5a52-50b8-4d44-a80e-856ccc5e5514","Type":"ContainerStarted","Data":"6de3c2bcbfa20597e4f27b96a1001ec8b50ab0a7a2179aedc7112b9d31ba1d86"} Mar 12 14:06:02 crc kubenswrapper[4921]: I0312 14:06:02.998280 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" podStartSLOduration=1.53717555 podStartE2EDuration="2.998256283s" podCreationTimestamp="2026-03-12 14:06:00 +0000 UTC" firstStartedPulling="2026-03-12 14:06:00.977204181 +0000 UTC m=+3383.667276162" lastFinishedPulling="2026-03-12 14:06:02.438284934 +0000 UTC m=+3385.128356895" observedRunningTime="2026-03-12 14:06:02.987726549 +0000 UTC m=+3385.677798520" watchObservedRunningTime="2026-03-12 14:06:02.998256283 +0000 UTC m=+3385.688328254" Mar 12 14:06:03 crc kubenswrapper[4921]: I0312 14:06:03.983281 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="adfd5a52-50b8-4d44-a80e-856ccc5e5514" containerID="6de3c2bcbfa20597e4f27b96a1001ec8b50ab0a7a2179aedc7112b9d31ba1d86" exitCode=0 Mar 12 14:06:03 crc kubenswrapper[4921]: I0312 14:06:03.992547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" event={"ID":"adfd5a52-50b8-4d44-a80e-856ccc5e5514","Type":"ContainerDied","Data":"6de3c2bcbfa20597e4f27b96a1001ec8b50ab0a7a2179aedc7112b9d31ba1d86"} Mar 12 14:06:05 crc kubenswrapper[4921]: I0312 14:06:05.586632 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:05 crc kubenswrapper[4921]: I0312 14:06:05.780324 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhbft\" (UniqueName: \"kubernetes.io/projected/adfd5a52-50b8-4d44-a80e-856ccc5e5514-kube-api-access-bhbft\") pod \"adfd5a52-50b8-4d44-a80e-856ccc5e5514\" (UID: \"adfd5a52-50b8-4d44-a80e-856ccc5e5514\") " Mar 12 14:06:05 crc kubenswrapper[4921]: I0312 14:06:05.789039 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adfd5a52-50b8-4d44-a80e-856ccc5e5514-kube-api-access-bhbft" (OuterVolumeSpecName: "kube-api-access-bhbft") pod "adfd5a52-50b8-4d44-a80e-856ccc5e5514" (UID: "adfd5a52-50b8-4d44-a80e-856ccc5e5514"). InnerVolumeSpecName "kube-api-access-bhbft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:06:05 crc kubenswrapper[4921]: I0312 14:06:05.883643 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhbft\" (UniqueName: \"kubernetes.io/projected/adfd5a52-50b8-4d44-a80e-856ccc5e5514-kube-api-access-bhbft\") on node \"crc\" DevicePath \"\"" Mar 12 14:06:06 crc kubenswrapper[4921]: I0312 14:06:06.005776 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" event={"ID":"adfd5a52-50b8-4d44-a80e-856ccc5e5514","Type":"ContainerDied","Data":"6aa638bfb782a02617ce03008b9adc32ece95d7263216be8f364d28dca16e4a3"} Mar 12 14:06:06 crc kubenswrapper[4921]: I0312 14:06:06.005837 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa638bfb782a02617ce03008b9adc32ece95d7263216be8f364d28dca16e4a3" Mar 12 14:06:06 crc kubenswrapper[4921]: I0312 14:06:06.005924 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555406-pfcj8" Mar 12 14:06:06 crc kubenswrapper[4921]: I0312 14:06:06.071285 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555400-nddn9"] Mar 12 14:06:06 crc kubenswrapper[4921]: I0312 14:06:06.080042 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555400-nddn9"] Mar 12 14:06:08 crc kubenswrapper[4921]: I0312 14:06:08.005215 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9477785-0666-4867-b966-5ea53dd6f0ea" path="/var/lib/kubelet/pods/a9477785-0666-4867-b966-5ea53dd6f0ea/volumes" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.277834 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x4ppd"] Mar 12 14:06:27 crc kubenswrapper[4921]: E0312 14:06:27.279029 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adfd5a52-50b8-4d44-a80e-856ccc5e5514" containerName="oc" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.279049 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="adfd5a52-50b8-4d44-a80e-856ccc5e5514" containerName="oc" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.279281 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="adfd5a52-50b8-4d44-a80e-856ccc5e5514" containerName="oc" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.281117 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.289234 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4ppd"] Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.446402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-utilities\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.446796 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjlcr\" (UniqueName: \"kubernetes.io/projected/c085a7b1-56be-4c60-8955-c000ccfb6ec8-kube-api-access-jjlcr\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.446874 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-catalog-content\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " 
pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.548572 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-utilities\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.548954 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjlcr\" (UniqueName: \"kubernetes.io/projected/c085a7b1-56be-4c60-8955-c000ccfb6ec8-kube-api-access-jjlcr\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.548994 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-catalog-content\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.549091 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-utilities\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.549413 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-catalog-content\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " 
pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.569209 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjlcr\" (UniqueName: \"kubernetes.io/projected/c085a7b1-56be-4c60-8955-c000ccfb6ec8-kube-api-access-jjlcr\") pod \"certified-operators-x4ppd\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:27 crc kubenswrapper[4921]: I0312 14:06:27.656546 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:28 crc kubenswrapper[4921]: I0312 14:06:28.146392 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x4ppd"] Mar 12 14:06:28 crc kubenswrapper[4921]: I0312 14:06:28.213602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerStarted","Data":"e80e8eb1d38032342485e9069621fbede6ff022358db625492faf36617170501"} Mar 12 14:06:28 crc kubenswrapper[4921]: E0312 14:06:28.516550 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc085a7b1_56be_4c60_8955_c000ccfb6ec8.slice/crio-conmon-011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476.scope\": RecentStats: unable to find data in memory cache]" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.075199 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jk9xn"] Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.077687 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.090893 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jk9xn"] Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.226203 4921 generic.go:334] "Generic (PLEG): container finished" podID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerID="011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476" exitCode=0 Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.226252 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerDied","Data":"011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476"} Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.281892 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rhhs\" (UniqueName: \"kubernetes.io/projected/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-kube-api-access-4rhhs\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.281988 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-utilities\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.282195 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-catalog-content\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") 
" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.384667 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-catalog-content\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.385125 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rhhs\" (UniqueName: \"kubernetes.io/projected/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-kube-api-access-4rhhs\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.385169 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-utilities\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.385189 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-catalog-content\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.385653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-utilities\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc 
kubenswrapper[4921]: I0312 14:06:29.420402 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rhhs\" (UniqueName: \"kubernetes.io/projected/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-kube-api-access-4rhhs\") pod \"redhat-operators-jk9xn\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.670740 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4mlr6"] Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.679337 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.693394 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mlr6"] Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.712381 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.793608 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-utilities\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.793762 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9788\" (UniqueName: \"kubernetes.io/projected/1e520d70-8202-4958-a390-52590ccb5300-kube-api-access-c9788\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.793858 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-catalog-content\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.899342 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-utilities\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.899660 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9788\" (UniqueName: \"kubernetes.io/projected/1e520d70-8202-4958-a390-52590ccb5300-kube-api-access-c9788\") pod 
\"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.899732 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-catalog-content\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.899924 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-utilities\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.900189 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-catalog-content\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:29 crc kubenswrapper[4921]: I0312 14:06:29.935276 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9788\" (UniqueName: \"kubernetes.io/projected/1e520d70-8202-4958-a390-52590ccb5300-kube-api-access-c9788\") pod \"community-operators-4mlr6\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:30 crc kubenswrapper[4921]: I0312 14:06:30.000981 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:30 crc kubenswrapper[4921]: I0312 14:06:30.242264 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerStarted","Data":"4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50"} Mar 12 14:06:30 crc kubenswrapper[4921]: I0312 14:06:30.276682 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jk9xn"] Mar 12 14:06:30 crc kubenswrapper[4921]: W0312 14:06:30.276764 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5e15fc_5680_4fb4_8d36_154efdfbb4cc.slice/crio-186065d27c7b89253a4fb0cf5fecf2bd6df6df05ae3d766ab1ca5cac98ae412d WatchSource:0}: Error finding container 186065d27c7b89253a4fb0cf5fecf2bd6df6df05ae3d766ab1ca5cac98ae412d: Status 404 returned error can't find the container with id 186065d27c7b89253a4fb0cf5fecf2bd6df6df05ae3d766ab1ca5cac98ae412d Mar 12 14:06:30 crc kubenswrapper[4921]: I0312 14:06:30.505527 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4mlr6"] Mar 12 14:06:31 crc kubenswrapper[4921]: I0312 14:06:31.251797 4921 generic.go:334] "Generic (PLEG): container finished" podID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerID="be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc" exitCode=0 Mar 12 14:06:31 crc kubenswrapper[4921]: I0312 14:06:31.252049 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerDied","Data":"be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc"} Mar 12 14:06:31 crc kubenswrapper[4921]: I0312 14:06:31.252211 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerStarted","Data":"186065d27c7b89253a4fb0cf5fecf2bd6df6df05ae3d766ab1ca5cac98ae412d"} Mar 12 14:06:31 crc kubenswrapper[4921]: I0312 14:06:31.254832 4921 generic.go:334] "Generic (PLEG): container finished" podID="1e520d70-8202-4958-a390-52590ccb5300" containerID="f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d" exitCode=0 Mar 12 14:06:31 crc kubenswrapper[4921]: I0312 14:06:31.254931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerDied","Data":"f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d"} Mar 12 14:06:31 crc kubenswrapper[4921]: I0312 14:06:31.254972 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerStarted","Data":"5f45f738a1d706892824e60f3a8d7efb713cf6321bd479b779db70b4769e71f6"} Mar 12 14:06:33 crc kubenswrapper[4921]: I0312 14:06:33.274487 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerStarted","Data":"cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0"} Mar 12 14:06:33 crc kubenswrapper[4921]: I0312 14:06:33.276833 4921 generic.go:334] "Generic (PLEG): container finished" podID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerID="4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50" exitCode=0 Mar 12 14:06:33 crc kubenswrapper[4921]: I0312 14:06:33.276891 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" 
event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerDied","Data":"4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50"} Mar 12 14:06:33 crc kubenswrapper[4921]: I0312 14:06:33.279318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerStarted","Data":"09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133"} Mar 12 14:06:34 crc kubenswrapper[4921]: I0312 14:06:34.293055 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerStarted","Data":"4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70"} Mar 12 14:06:34 crc kubenswrapper[4921]: I0312 14:06:34.332065 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x4ppd" podStartSLOduration=2.786739963 podStartE2EDuration="7.332039745s" podCreationTimestamp="2026-03-12 14:06:27 +0000 UTC" firstStartedPulling="2026-03-12 14:06:29.228538028 +0000 UTC m=+3411.918610009" lastFinishedPulling="2026-03-12 14:06:33.77383782 +0000 UTC m=+3416.463909791" observedRunningTime="2026-03-12 14:06:34.320804669 +0000 UTC m=+3417.010876700" watchObservedRunningTime="2026-03-12 14:06:34.332039745 +0000 UTC m=+3417.022111746" Mar 12 14:06:36 crc kubenswrapper[4921]: I0312 14:06:36.310199 4921 generic.go:334] "Generic (PLEG): container finished" podID="1e520d70-8202-4958-a390-52590ccb5300" containerID="cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0" exitCode=0 Mar 12 14:06:36 crc kubenswrapper[4921]: I0312 14:06:36.310360 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" 
event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerDied","Data":"cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0"} Mar 12 14:06:37 crc kubenswrapper[4921]: I0312 14:06:37.657559 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:37 crc kubenswrapper[4921]: I0312 14:06:37.657929 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:38 crc kubenswrapper[4921]: I0312 14:06:38.334621 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerStarted","Data":"2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85"} Mar 12 14:06:38 crc kubenswrapper[4921]: I0312 14:06:38.357106 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4mlr6" podStartSLOduration=3.591207279 podStartE2EDuration="9.357086573s" podCreationTimestamp="2026-03-12 14:06:29 +0000 UTC" firstStartedPulling="2026-03-12 14:06:31.258873065 +0000 UTC m=+3413.948945036" lastFinishedPulling="2026-03-12 14:06:37.024752359 +0000 UTC m=+3419.714824330" observedRunningTime="2026-03-12 14:06:38.352349377 +0000 UTC m=+3421.042421348" watchObservedRunningTime="2026-03-12 14:06:38.357086573 +0000 UTC m=+3421.047158554" Mar 12 14:06:38 crc kubenswrapper[4921]: I0312 14:06:38.707068 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-x4ppd" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="registry-server" probeResult="failure" output=< Mar 12 14:06:38 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:06:38 crc kubenswrapper[4921]: > Mar 12 14:06:39 crc kubenswrapper[4921]: I0312 14:06:39.344660 4921 generic.go:334] "Generic 
(PLEG): container finished" podID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerID="09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133" exitCode=0 Mar 12 14:06:39 crc kubenswrapper[4921]: I0312 14:06:39.344984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerDied","Data":"09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133"} Mar 12 14:06:40 crc kubenswrapper[4921]: I0312 14:06:40.002178 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:40 crc kubenswrapper[4921]: I0312 14:06:40.002223 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:41 crc kubenswrapper[4921]: I0312 14:06:41.058044 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4mlr6" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="registry-server" probeResult="failure" output=< Mar 12 14:06:41 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:06:41 crc kubenswrapper[4921]: > Mar 12 14:06:41 crc kubenswrapper[4921]: I0312 14:06:41.365610 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerStarted","Data":"4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968"} Mar 12 14:06:41 crc kubenswrapper[4921]: I0312 14:06:41.386268 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jk9xn" podStartSLOduration=3.35403205 podStartE2EDuration="12.386250686s" podCreationTimestamp="2026-03-12 14:06:29 +0000 UTC" firstStartedPulling="2026-03-12 14:06:31.254205732 +0000 UTC m=+3413.944277703" 
lastFinishedPulling="2026-03-12 14:06:40.286424368 +0000 UTC m=+3422.976496339" observedRunningTime="2026-03-12 14:06:41.380643803 +0000 UTC m=+3424.070715784" watchObservedRunningTime="2026-03-12 14:06:41.386250686 +0000 UTC m=+3424.076322647" Mar 12 14:06:45 crc kubenswrapper[4921]: I0312 14:06:45.360159 4921 scope.go:117] "RemoveContainer" containerID="bab3318f7e4bfa10390f12f5745b7634213c91719deb18176f1d832b464ed9d4" Mar 12 14:06:48 crc kubenswrapper[4921]: I0312 14:06:48.701138 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-x4ppd" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="registry-server" probeResult="failure" output=< Mar 12 14:06:48 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:06:48 crc kubenswrapper[4921]: > Mar 12 14:06:49 crc kubenswrapper[4921]: I0312 14:06:49.713220 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:49 crc kubenswrapper[4921]: I0312 14:06:49.713548 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:50 crc kubenswrapper[4921]: I0312 14:06:50.060536 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:50 crc kubenswrapper[4921]: I0312 14:06:50.127686 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:50 crc kubenswrapper[4921]: I0312 14:06:50.297663 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mlr6"] Mar 12 14:06:50 crc kubenswrapper[4921]: I0312 14:06:50.774618 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jk9xn" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" 
containerName="registry-server" probeResult="failure" output=< Mar 12 14:06:50 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:06:50 crc kubenswrapper[4921]: > Mar 12 14:06:51 crc kubenswrapper[4921]: I0312 14:06:51.469456 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4mlr6" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="registry-server" containerID="cri-o://2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85" gracePeriod=2 Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.184617 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.263972 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-utilities\") pod \"1e520d70-8202-4958-a390-52590ccb5300\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.264008 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9788\" (UniqueName: \"kubernetes.io/projected/1e520d70-8202-4958-a390-52590ccb5300-kube-api-access-c9788\") pod \"1e520d70-8202-4958-a390-52590ccb5300\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.264038 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-catalog-content\") pod \"1e520d70-8202-4958-a390-52590ccb5300\" (UID: \"1e520d70-8202-4958-a390-52590ccb5300\") " Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.264683 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-utilities" (OuterVolumeSpecName: "utilities") pod "1e520d70-8202-4958-a390-52590ccb5300" (UID: "1e520d70-8202-4958-a390-52590ccb5300"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.271045 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e520d70-8202-4958-a390-52590ccb5300-kube-api-access-c9788" (OuterVolumeSpecName: "kube-api-access-c9788") pod "1e520d70-8202-4958-a390-52590ccb5300" (UID: "1e520d70-8202-4958-a390-52590ccb5300"). InnerVolumeSpecName "kube-api-access-c9788". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.330770 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e520d70-8202-4958-a390-52590ccb5300" (UID: "1e520d70-8202-4958-a390-52590ccb5300"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.366917 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.367286 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9788\" (UniqueName: \"kubernetes.io/projected/1e520d70-8202-4958-a390-52590ccb5300-kube-api-access-c9788\") on node \"crc\" DevicePath \"\"" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.367418 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e520d70-8202-4958-a390-52590ccb5300-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.478863 4921 generic.go:334] "Generic (PLEG): container finished" podID="1e520d70-8202-4958-a390-52590ccb5300" containerID="2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85" exitCode=0 Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.478922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerDied","Data":"2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85"} Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.479279 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4mlr6" event={"ID":"1e520d70-8202-4958-a390-52590ccb5300","Type":"ContainerDied","Data":"5f45f738a1d706892824e60f3a8d7efb713cf6321bd479b779db70b4769e71f6"} Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.479380 4921 scope.go:117] "RemoveContainer" containerID="2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 
14:06:52.478982 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4mlr6" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.522506 4921 scope.go:117] "RemoveContainer" containerID="cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.528419 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4mlr6"] Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.544457 4921 scope.go:117] "RemoveContainer" containerID="f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.549909 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4mlr6"] Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.590920 4921 scope.go:117] "RemoveContainer" containerID="2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85" Mar 12 14:06:52 crc kubenswrapper[4921]: E0312 14:06:52.591290 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85\": container with ID starting with 2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85 not found: ID does not exist" containerID="2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.591324 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85"} err="failed to get container status \"2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85\": rpc error: code = NotFound desc = could not find container \"2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85\": container with ID starting with 
2e7f5e1a5254c6d5f04c0e1b22a0f5c4481cbaaff5d8064bebcd4488914c2e85 not found: ID does not exist" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.591362 4921 scope.go:117] "RemoveContainer" containerID="cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0" Mar 12 14:06:52 crc kubenswrapper[4921]: E0312 14:06:52.591760 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0\": container with ID starting with cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0 not found: ID does not exist" containerID="cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.591826 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0"} err="failed to get container status \"cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0\": rpc error: code = NotFound desc = could not find container \"cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0\": container with ID starting with cd7abe89c3427515d98ed3063893d846931f83d43436eb9d54d0796eba3331b0 not found: ID does not exist" Mar 12 14:06:52 crc kubenswrapper[4921]: I0312 14:06:52.591864 4921 scope.go:117] "RemoveContainer" containerID="f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d" Mar 12 14:06:52 crc kubenswrapper[4921]: E0312 14:06:52.592211 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d\": container with ID starting with f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d not found: ID does not exist" containerID="f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d" Mar 12 14:06:52 crc 
kubenswrapper[4921]: I0312 14:06:52.592242 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d"} err="failed to get container status \"f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d\": rpc error: code = NotFound desc = could not find container \"f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d\": container with ID starting with f8a66f32bfe84622ebe56ed6409f0b4b393d67b1c78d323155a99a07a6cda54d not found: ID does not exist" Mar 12 14:06:53 crc kubenswrapper[4921]: I0312 14:06:53.994461 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e520d70-8202-4958-a390-52590ccb5300" path="/var/lib/kubelet/pods/1e520d70-8202-4958-a390-52590ccb5300/volumes" Mar 12 14:06:56 crc kubenswrapper[4921]: I0312 14:06:56.324279 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:06:56 crc kubenswrapper[4921]: I0312 14:06:56.324352 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:06:57 crc kubenswrapper[4921]: I0312 14:06:57.725234 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:57 crc kubenswrapper[4921]: I0312 14:06:57.790174 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:06:58 crc kubenswrapper[4921]: I0312 
14:06:58.473994 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4ppd"] Mar 12 14:06:59 crc kubenswrapper[4921]: I0312 14:06:59.535750 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x4ppd" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="registry-server" containerID="cri-o://4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70" gracePeriod=2 Mar 12 14:06:59 crc kubenswrapper[4921]: I0312 14:06:59.772466 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:06:59 crc kubenswrapper[4921]: I0312 14:06:59.841353 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.232226 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.343964 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-catalog-content\") pod \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.344015 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-utilities\") pod \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.344052 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjlcr\" (UniqueName: 
\"kubernetes.io/projected/c085a7b1-56be-4c60-8955-c000ccfb6ec8-kube-api-access-jjlcr\") pod \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\" (UID: \"c085a7b1-56be-4c60-8955-c000ccfb6ec8\") " Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.345160 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-utilities" (OuterVolumeSpecName: "utilities") pod "c085a7b1-56be-4c60-8955-c000ccfb6ec8" (UID: "c085a7b1-56be-4c60-8955-c000ccfb6ec8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.350975 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c085a7b1-56be-4c60-8955-c000ccfb6ec8-kube-api-access-jjlcr" (OuterVolumeSpecName: "kube-api-access-jjlcr") pod "c085a7b1-56be-4c60-8955-c000ccfb6ec8" (UID: "c085a7b1-56be-4c60-8955-c000ccfb6ec8"). InnerVolumeSpecName "kube-api-access-jjlcr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.421271 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c085a7b1-56be-4c60-8955-c000ccfb6ec8" (UID: "c085a7b1-56be-4c60-8955-c000ccfb6ec8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.446429 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.446462 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c085a7b1-56be-4c60-8955-c000ccfb6ec8-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.446476 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjlcr\" (UniqueName: \"kubernetes.io/projected/c085a7b1-56be-4c60-8955-c000ccfb6ec8-kube-api-access-jjlcr\") on node \"crc\" DevicePath \"\"" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.566991 4921 generic.go:334] "Generic (PLEG): container finished" podID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerID="4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70" exitCode=0 Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.568916 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x4ppd" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.575123 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerDied","Data":"4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70"} Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.575200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x4ppd" event={"ID":"c085a7b1-56be-4c60-8955-c000ccfb6ec8","Type":"ContainerDied","Data":"e80e8eb1d38032342485e9069621fbede6ff022358db625492faf36617170501"} Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.575224 4921 scope.go:117] "RemoveContainer" containerID="4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.608115 4921 scope.go:117] "RemoveContainer" containerID="4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.615533 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x4ppd"] Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.625392 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x4ppd"] Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.632602 4921 scope.go:117] "RemoveContainer" containerID="011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.683046 4921 scope.go:117] "RemoveContainer" containerID="4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70" Mar 12 14:07:00 crc kubenswrapper[4921]: E0312 14:07:00.687536 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70\": container with ID starting with 4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70 not found: ID does not exist" containerID="4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.687572 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70"} err="failed to get container status \"4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70\": rpc error: code = NotFound desc = could not find container \"4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70\": container with ID starting with 4a166d499b99d1aa56c3f3d3609ff4d0ecc9eebd86c41c6a549e011ff7f07b70 not found: ID does not exist" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.687603 4921 scope.go:117] "RemoveContainer" containerID="4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50" Mar 12 14:07:00 crc kubenswrapper[4921]: E0312 14:07:00.690350 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50\": container with ID starting with 4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50 not found: ID does not exist" containerID="4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.690377 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50"} err="failed to get container status \"4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50\": rpc error: code = NotFound desc = could not find container \"4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50\": container with ID 
starting with 4526d676498d4ff33257f4124e09e2a40076e54084c798c766f1297ca1563f50 not found: ID does not exist" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.690393 4921 scope.go:117] "RemoveContainer" containerID="011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476" Mar 12 14:07:00 crc kubenswrapper[4921]: E0312 14:07:00.690753 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476\": container with ID starting with 011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476 not found: ID does not exist" containerID="011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476" Mar 12 14:07:00 crc kubenswrapper[4921]: I0312 14:07:00.690773 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476"} err="failed to get container status \"011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476\": rpc error: code = NotFound desc = could not find container \"011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476\": container with ID starting with 011f57a495d195cb03a16197ea1fc82dc870221c5a6e0da41fee1eb8cbf2f476 not found: ID does not exist" Mar 12 14:07:01 crc kubenswrapper[4921]: I0312 14:07:01.273547 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jk9xn"] Mar 12 14:07:01 crc kubenswrapper[4921]: I0312 14:07:01.575614 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jk9xn" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="registry-server" containerID="cri-o://4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968" gracePeriod=2 Mar 12 14:07:01 crc kubenswrapper[4921]: I0312 14:07:01.995553 4921 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" path="/var/lib/kubelet/pods/c085a7b1-56be-4c60-8955-c000ccfb6ec8/volumes" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.164855 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.191081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-utilities\") pod \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.191160 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rhhs\" (UniqueName: \"kubernetes.io/projected/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-kube-api-access-4rhhs\") pod \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.191315 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-catalog-content\") pod \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\" (UID: \"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc\") " Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.192804 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-utilities" (OuterVolumeSpecName: "utilities") pod "3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" (UID: "3c5e15fc-5680-4fb4-8d36-154efdfbb4cc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.201108 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-kube-api-access-4rhhs" (OuterVolumeSpecName: "kube-api-access-4rhhs") pod "3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" (UID: "3c5e15fc-5680-4fb4-8d36-154efdfbb4cc"). InnerVolumeSpecName "kube-api-access-4rhhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.294068 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.294335 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rhhs\" (UniqueName: \"kubernetes.io/projected/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-kube-api-access-4rhhs\") on node \"crc\" DevicePath \"\"" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.339397 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" (UID: "3c5e15fc-5680-4fb4-8d36-154efdfbb4cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.396452 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.585942 4921 generic.go:334] "Generic (PLEG): container finished" podID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerID="4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968" exitCode=0 Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.585983 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerDied","Data":"4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968"} Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.586015 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jk9xn" event={"ID":"3c5e15fc-5680-4fb4-8d36-154efdfbb4cc","Type":"ContainerDied","Data":"186065d27c7b89253a4fb0cf5fecf2bd6df6df05ae3d766ab1ca5cac98ae412d"} Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.586035 4921 scope.go:117] "RemoveContainer" containerID="4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.587188 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jk9xn" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.616208 4921 scope.go:117] "RemoveContainer" containerID="09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.625331 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jk9xn"] Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.636626 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jk9xn"] Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.642743 4921 scope.go:117] "RemoveContainer" containerID="be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.684219 4921 scope.go:117] "RemoveContainer" containerID="4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968" Mar 12 14:07:02 crc kubenswrapper[4921]: E0312 14:07:02.685091 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968\": container with ID starting with 4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968 not found: ID does not exist" containerID="4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.685140 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968"} err="failed to get container status \"4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968\": rpc error: code = NotFound desc = could not find container \"4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968\": container with ID starting with 4cbfe18730f228ca4a670fdf049a8d4b0a9d45291641b0bc5ef4b52d52c65968 not found: ID does 
not exist" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.685166 4921 scope.go:117] "RemoveContainer" containerID="09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133" Mar 12 14:07:02 crc kubenswrapper[4921]: E0312 14:07:02.685722 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133\": container with ID starting with 09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133 not found: ID does not exist" containerID="09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.685769 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133"} err="failed to get container status \"09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133\": rpc error: code = NotFound desc = could not find container \"09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133\": container with ID starting with 09c43d71211cdefa4600a1e12ca2da08401cb6d4b4f57ff88eb28856833f8133 not found: ID does not exist" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.685796 4921 scope.go:117] "RemoveContainer" containerID="be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc" Mar 12 14:07:02 crc kubenswrapper[4921]: E0312 14:07:02.686141 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc\": container with ID starting with be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc not found: ID does not exist" containerID="be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc" Mar 12 14:07:02 crc kubenswrapper[4921]: I0312 14:07:02.686203 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc"} err="failed to get container status \"be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc\": rpc error: code = NotFound desc = could not find container \"be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc\": container with ID starting with be46df4f161af31bb08c3bffb42cdf2cfec45c851eb04a352331c01f70dbc2cc not found: ID does not exist" Mar 12 14:07:03 crc kubenswrapper[4921]: I0312 14:07:03.994384 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" path="/var/lib/kubelet/pods/3c5e15fc-5680-4fb4-8d36-154efdfbb4cc/volumes" Mar 12 14:07:26 crc kubenswrapper[4921]: I0312 14:07:26.324232 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:07:26 crc kubenswrapper[4921]: I0312 14:07:26.324800 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:07:45 crc kubenswrapper[4921]: I0312 14:07:45.447948 4921 scope.go:117] "RemoveContainer" containerID="83326320c66c2208b995e7c5fe0d56a94154636a9722ca249c50e59ed101c393" Mar 12 14:07:45 crc kubenswrapper[4921]: I0312 14:07:45.476031 4921 scope.go:117] "RemoveContainer" containerID="4ac67f9ad89b4d336da226d7912a68dde958aab8a3aebb1cd0a3205f2b3433b7" Mar 12 14:07:56 crc kubenswrapper[4921]: I0312 14:07:56.323252 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:07:56 crc kubenswrapper[4921]: I0312 14:07:56.323768 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:07:56 crc kubenswrapper[4921]: I0312 14:07:56.323806 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:07:56 crc kubenswrapper[4921]: I0312 14:07:56.324396 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcea6ad9c4de2d2d2b80377b6008ddac70ede11b4baa44e9ad37e97a8c292848"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:07:56 crc kubenswrapper[4921]: I0312 14:07:56.324438 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://dcea6ad9c4de2d2d2b80377b6008ddac70ede11b4baa44e9ad37e97a8c292848" gracePeriod=600 Mar 12 14:07:57 crc kubenswrapper[4921]: I0312 14:07:57.064896 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="dcea6ad9c4de2d2d2b80377b6008ddac70ede11b4baa44e9ad37e97a8c292848" exitCode=0 Mar 12 14:07:57 crc kubenswrapper[4921]: I0312 14:07:57.064973 4921 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"dcea6ad9c4de2d2d2b80377b6008ddac70ede11b4baa44e9ad37e97a8c292848"} Mar 12 14:07:57 crc kubenswrapper[4921]: I0312 14:07:57.065553 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb"} Mar 12 14:07:57 crc kubenswrapper[4921]: I0312 14:07:57.065578 4921 scope.go:117] "RemoveContainer" containerID="4ac2d4fc600ec09d18cac9e053676b6dc99d5229da81b484cb37dbd6196e43d6" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.139503 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555408-gh4wv"] Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140581 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="extract-content" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140597 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="extract-content" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140621 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140629 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140644 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140652 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140670 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="extract-utilities" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140680 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="extract-utilities" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140697 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="extract-utilities" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140705 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="extract-utilities" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140718 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="extract-content" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140726 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="extract-content" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140737 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140745 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140761 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="extract-utilities" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140770 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="extract-utilities" Mar 12 14:08:00 crc kubenswrapper[4921]: E0312 14:08:00.140779 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="extract-content" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.140790 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="extract-content" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.141051 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e520d70-8202-4958-a390-52590ccb5300" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.141073 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c5e15fc-5680-4fb4-8d36-154efdfbb4cc" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.141100 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c085a7b1-56be-4c60-8955-c000ccfb6ec8" containerName="registry-server" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.141969 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.144195 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.144311 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.147202 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.150247 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555408-gh4wv"] Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.202434 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9r6v\" (UniqueName: \"kubernetes.io/projected/c93ce20a-c243-456e-8c80-b35db8025428-kube-api-access-q9r6v\") pod \"auto-csr-approver-29555408-gh4wv\" (UID: \"c93ce20a-c243-456e-8c80-b35db8025428\") " pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.305314 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9r6v\" (UniqueName: \"kubernetes.io/projected/c93ce20a-c243-456e-8c80-b35db8025428-kube-api-access-q9r6v\") pod \"auto-csr-approver-29555408-gh4wv\" (UID: \"c93ce20a-c243-456e-8c80-b35db8025428\") " pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.331170 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9r6v\" (UniqueName: \"kubernetes.io/projected/c93ce20a-c243-456e-8c80-b35db8025428-kube-api-access-q9r6v\") pod \"auto-csr-approver-29555408-gh4wv\" (UID: \"c93ce20a-c243-456e-8c80-b35db8025428\") " 
pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.460354 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:00 crc kubenswrapper[4921]: I0312 14:08:00.968417 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555408-gh4wv"] Mar 12 14:08:00 crc kubenswrapper[4921]: W0312 14:08:00.974517 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc93ce20a_c243_456e_8c80_b35db8025428.slice/crio-74ef85619fcc57d51a13555832a08d6e129cec25307d92b9e690519315f5e5b6 WatchSource:0}: Error finding container 74ef85619fcc57d51a13555832a08d6e129cec25307d92b9e690519315f5e5b6: Status 404 returned error can't find the container with id 74ef85619fcc57d51a13555832a08d6e129cec25307d92b9e690519315f5e5b6 Mar 12 14:08:01 crc kubenswrapper[4921]: I0312 14:08:01.100686 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" event={"ID":"c93ce20a-c243-456e-8c80-b35db8025428","Type":"ContainerStarted","Data":"74ef85619fcc57d51a13555832a08d6e129cec25307d92b9e690519315f5e5b6"} Mar 12 14:08:03 crc kubenswrapper[4921]: I0312 14:08:03.119077 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" event={"ID":"c93ce20a-c243-456e-8c80-b35db8025428","Type":"ContainerStarted","Data":"50be04af38672c6279295602bb9bfee3662ce916cc804d6adb830cdf8b7ae601"} Mar 12 14:08:04 crc kubenswrapper[4921]: I0312 14:08:04.127702 4921 generic.go:334] "Generic (PLEG): container finished" podID="c93ce20a-c243-456e-8c80-b35db8025428" containerID="50be04af38672c6279295602bb9bfee3662ce916cc804d6adb830cdf8b7ae601" exitCode=0 Mar 12 14:08:04 crc kubenswrapper[4921]: I0312 14:08:04.127972 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29555408-gh4wv" event={"ID":"c93ce20a-c243-456e-8c80-b35db8025428","Type":"ContainerDied","Data":"50be04af38672c6279295602bb9bfee3662ce916cc804d6adb830cdf8b7ae601"} Mar 12 14:08:05 crc kubenswrapper[4921]: I0312 14:08:05.627683 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:05 crc kubenswrapper[4921]: I0312 14:08:05.705070 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9r6v\" (UniqueName: \"kubernetes.io/projected/c93ce20a-c243-456e-8c80-b35db8025428-kube-api-access-q9r6v\") pod \"c93ce20a-c243-456e-8c80-b35db8025428\" (UID: \"c93ce20a-c243-456e-8c80-b35db8025428\") " Mar 12 14:08:05 crc kubenswrapper[4921]: I0312 14:08:05.713842 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c93ce20a-c243-456e-8c80-b35db8025428-kube-api-access-q9r6v" (OuterVolumeSpecName: "kube-api-access-q9r6v") pod "c93ce20a-c243-456e-8c80-b35db8025428" (UID: "c93ce20a-c243-456e-8c80-b35db8025428"). InnerVolumeSpecName "kube-api-access-q9r6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:08:05 crc kubenswrapper[4921]: I0312 14:08:05.807930 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9r6v\" (UniqueName: \"kubernetes.io/projected/c93ce20a-c243-456e-8c80-b35db8025428-kube-api-access-q9r6v\") on node \"crc\" DevicePath \"\"" Mar 12 14:08:06 crc kubenswrapper[4921]: I0312 14:08:06.147194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" event={"ID":"c93ce20a-c243-456e-8c80-b35db8025428","Type":"ContainerDied","Data":"74ef85619fcc57d51a13555832a08d6e129cec25307d92b9e690519315f5e5b6"} Mar 12 14:08:06 crc kubenswrapper[4921]: I0312 14:08:06.147233 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ef85619fcc57d51a13555832a08d6e129cec25307d92b9e690519315f5e5b6" Mar 12 14:08:06 crc kubenswrapper[4921]: I0312 14:08:06.147243 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555408-gh4wv" Mar 12 14:08:06 crc kubenswrapper[4921]: I0312 14:08:06.196681 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555402-jsfzg"] Mar 12 14:08:06 crc kubenswrapper[4921]: I0312 14:08:06.205060 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555402-jsfzg"] Mar 12 14:08:07 crc kubenswrapper[4921]: I0312 14:08:07.998626 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626b7901-75fd-4a39-bef1-2fc34d374f41" path="/var/lib/kubelet/pods/626b7901-75fd-4a39-bef1-2fc34d374f41/volumes" Mar 12 14:08:45 crc kubenswrapper[4921]: I0312 14:08:45.590885 4921 scope.go:117] "RemoveContainer" containerID="92eb4c6b9c5004a837e83effdc00b01cbf738416d64a6afb99e84d38fecc7584" Mar 12 14:08:45 crc kubenswrapper[4921]: I0312 14:08:45.820538 4921 scope.go:117] "RemoveContainer" 
containerID="77e0acb25e568d205e7e59e9d8be85b343ee0622b60ebfc9eea244d3ed4c049e" Mar 12 14:08:45 crc kubenswrapper[4921]: I0312 14:08:45.853904 4921 scope.go:117] "RemoveContainer" containerID="39f0e93d2aac44476afa1cd4f5c38b431622b2118c53082b5e35045add8acf6c" Mar 12 14:09:56 crc kubenswrapper[4921]: I0312 14:09:56.324164 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:09:56 crc kubenswrapper[4921]: I0312 14:09:56.325193 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.147222 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555410-4mtkl"] Mar 12 14:10:00 crc kubenswrapper[4921]: E0312 14:10:00.148082 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c93ce20a-c243-456e-8c80-b35db8025428" containerName="oc" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.148093 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c93ce20a-c243-456e-8c80-b35db8025428" containerName="oc" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.148296 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c93ce20a-c243-456e-8c80-b35db8025428" containerName="oc" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.149005 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.151741 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.152626 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.152632 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.163597 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555410-4mtkl"] Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.235095 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvz5\" (UniqueName: \"kubernetes.io/projected/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18-kube-api-access-zsvz5\") pod \"auto-csr-approver-29555410-4mtkl\" (UID: \"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18\") " pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.337478 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvz5\" (UniqueName: \"kubernetes.io/projected/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18-kube-api-access-zsvz5\") pod \"auto-csr-approver-29555410-4mtkl\" (UID: \"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18\") " pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.358880 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvz5\" (UniqueName: \"kubernetes.io/projected/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18-kube-api-access-zsvz5\") pod \"auto-csr-approver-29555410-4mtkl\" (UID: \"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18\") " 
pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.470962 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:00 crc kubenswrapper[4921]: I0312 14:10:00.917503 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555410-4mtkl"] Mar 12 14:10:01 crc kubenswrapper[4921]: I0312 14:10:01.100008 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" event={"ID":"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18","Type":"ContainerStarted","Data":"c3a1e9c727495ba0a8490f30424e9eb8a5ecc2e14e111f9e9e5a411ae16137de"} Mar 12 14:10:03 crc kubenswrapper[4921]: I0312 14:10:03.121463 4921 generic.go:334] "Generic (PLEG): container finished" podID="61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18" containerID="3f7acd354d16fde7500d2b695982d47bdaae1eb2757bfa56e643a5a37b76b9e2" exitCode=0 Mar 12 14:10:03 crc kubenswrapper[4921]: I0312 14:10:03.123045 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" event={"ID":"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18","Type":"ContainerDied","Data":"3f7acd354d16fde7500d2b695982d47bdaae1eb2757bfa56e643a5a37b76b9e2"} Mar 12 14:10:04 crc kubenswrapper[4921]: I0312 14:10:04.697079 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:04 crc kubenswrapper[4921]: I0312 14:10:04.841774 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsvz5\" (UniqueName: \"kubernetes.io/projected/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18-kube-api-access-zsvz5\") pod \"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18\" (UID: \"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18\") " Mar 12 14:10:04 crc kubenswrapper[4921]: I0312 14:10:04.851241 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18-kube-api-access-zsvz5" (OuterVolumeSpecName: "kube-api-access-zsvz5") pod "61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18" (UID: "61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18"). InnerVolumeSpecName "kube-api-access-zsvz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:10:04 crc kubenswrapper[4921]: I0312 14:10:04.944522 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsvz5\" (UniqueName: \"kubernetes.io/projected/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18-kube-api-access-zsvz5\") on node \"crc\" DevicePath \"\"" Mar 12 14:10:05 crc kubenswrapper[4921]: I0312 14:10:05.142609 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" event={"ID":"61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18","Type":"ContainerDied","Data":"c3a1e9c727495ba0a8490f30424e9eb8a5ecc2e14e111f9e9e5a411ae16137de"} Mar 12 14:10:05 crc kubenswrapper[4921]: I0312 14:10:05.142650 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a1e9c727495ba0a8490f30424e9eb8a5ecc2e14e111f9e9e5a411ae16137de" Mar 12 14:10:05 crc kubenswrapper[4921]: I0312 14:10:05.142722 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555410-4mtkl" Mar 12 14:10:05 crc kubenswrapper[4921]: I0312 14:10:05.778624 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555404-dnjkp"] Mar 12 14:10:05 crc kubenswrapper[4921]: I0312 14:10:05.787734 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555404-dnjkp"] Mar 12 14:10:05 crc kubenswrapper[4921]: I0312 14:10:05.995598 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60262de2-7339-45cc-8f4e-7c74ede21b00" path="/var/lib/kubelet/pods/60262de2-7339-45cc-8f4e-7c74ede21b00/volumes" Mar 12 14:10:26 crc kubenswrapper[4921]: I0312 14:10:26.324143 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:10:26 crc kubenswrapper[4921]: I0312 14:10:26.324738 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:10:45 crc kubenswrapper[4921]: I0312 14:10:45.997804 4921 scope.go:117] "RemoveContainer" containerID="84e54ed51993e750155b176d5258bf04926bb8ea435dcd7673b5b0c8db4b1464" Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.324288 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:10:56 crc kubenswrapper[4921]: 
I0312 14:10:56.325022 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.325090 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.326164 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.326235 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" gracePeriod=600 Mar 12 14:10:56 crc kubenswrapper[4921]: E0312 14:10:56.468239 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.600346 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" exitCode=0 Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.600393 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb"} Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.600426 4921 scope.go:117] "RemoveContainer" containerID="dcea6ad9c4de2d2d2b80377b6008ddac70ede11b4baa44e9ad37e97a8c292848" Mar 12 14:10:56 crc kubenswrapper[4921]: I0312 14:10:56.601075 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:10:56 crc kubenswrapper[4921]: E0312 14:10:56.601415 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:11:09 crc kubenswrapper[4921]: I0312 14:11:09.983572 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:11:09 crc kubenswrapper[4921]: E0312 14:11:09.984424 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 
14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.340604 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8w2q2"] Mar 12 14:11:13 crc kubenswrapper[4921]: E0312 14:11:13.341590 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18" containerName="oc" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.341606 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18" containerName="oc" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.341857 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18" containerName="oc" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.343610 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.352935 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w2q2"] Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.492658 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2fk\" (UniqueName: \"kubernetes.io/projected/0b42fb70-17d3-4735-ac5b-723337223db1-kube-api-access-dp2fk\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.492753 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-catalog-content\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.492926 4921 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-utilities\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.595140 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-utilities\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.595289 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2fk\" (UniqueName: \"kubernetes.io/projected/0b42fb70-17d3-4735-ac5b-723337223db1-kube-api-access-dp2fk\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.595751 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-catalog-content\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.595850 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-utilities\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.596179 4921 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-catalog-content\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.630133 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2fk\" (UniqueName: \"kubernetes.io/projected/0b42fb70-17d3-4735-ac5b-723337223db1-kube-api-access-dp2fk\") pod \"redhat-marketplace-8w2q2\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:13 crc kubenswrapper[4921]: I0312 14:11:13.669266 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:14 crc kubenswrapper[4921]: I0312 14:11:14.117636 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w2q2"] Mar 12 14:11:14 crc kubenswrapper[4921]: I0312 14:11:14.773164 4921 generic.go:334] "Generic (PLEG): container finished" podID="0b42fb70-17d3-4735-ac5b-723337223db1" containerID="bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2" exitCode=0 Mar 12 14:11:14 crc kubenswrapper[4921]: I0312 14:11:14.773237 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerDied","Data":"bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2"} Mar 12 14:11:14 crc kubenswrapper[4921]: I0312 14:11:14.773423 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerStarted","Data":"0139f39bca24dc64953d79d4690a257119d3cecbc8cc55bd9288002781bdc60d"} Mar 12 14:11:14 crc kubenswrapper[4921]: I0312 
14:11:14.775366 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:11:15 crc kubenswrapper[4921]: I0312 14:11:15.785813 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerStarted","Data":"910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e"} Mar 12 14:11:16 crc kubenswrapper[4921]: I0312 14:11:16.798495 4921 generic.go:334] "Generic (PLEG): container finished" podID="0b42fb70-17d3-4735-ac5b-723337223db1" containerID="910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e" exitCode=0 Mar 12 14:11:16 crc kubenswrapper[4921]: I0312 14:11:16.798611 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerDied","Data":"910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e"} Mar 12 14:11:17 crc kubenswrapper[4921]: I0312 14:11:17.812523 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerStarted","Data":"846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502"} Mar 12 14:11:17 crc kubenswrapper[4921]: I0312 14:11:17.851206 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8w2q2" podStartSLOduration=2.437499162 podStartE2EDuration="4.851182303s" podCreationTimestamp="2026-03-12 14:11:13 +0000 UTC" firstStartedPulling="2026-03-12 14:11:14.775146679 +0000 UTC m=+3697.465218650" lastFinishedPulling="2026-03-12 14:11:17.18882982 +0000 UTC m=+3699.878901791" observedRunningTime="2026-03-12 14:11:17.83748401 +0000 UTC m=+3700.527556021" watchObservedRunningTime="2026-03-12 14:11:17.851182303 +0000 UTC m=+3700.541254314" Mar 12 14:11:23 crc 
kubenswrapper[4921]: I0312 14:11:23.670254 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:23 crc kubenswrapper[4921]: I0312 14:11:23.670804 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:23 crc kubenswrapper[4921]: I0312 14:11:23.753990 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:23 crc kubenswrapper[4921]: I0312 14:11:23.916966 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:23 crc kubenswrapper[4921]: I0312 14:11:23.996005 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w2q2"] Mar 12 14:11:24 crc kubenswrapper[4921]: I0312 14:11:24.983325 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:11:24 crc kubenswrapper[4921]: E0312 14:11:24.983618 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:11:25 crc kubenswrapper[4921]: I0312 14:11:25.907812 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8w2q2" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="registry-server" containerID="cri-o://846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502" gracePeriod=2 Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 
14:11:26.588357 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.784923 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp2fk\" (UniqueName: \"kubernetes.io/projected/0b42fb70-17d3-4735-ac5b-723337223db1-kube-api-access-dp2fk\") pod \"0b42fb70-17d3-4735-ac5b-723337223db1\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.784990 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-utilities\") pod \"0b42fb70-17d3-4735-ac5b-723337223db1\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.785053 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-catalog-content\") pod \"0b42fb70-17d3-4735-ac5b-723337223db1\" (UID: \"0b42fb70-17d3-4735-ac5b-723337223db1\") " Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.785776 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-utilities" (OuterVolumeSpecName: "utilities") pod "0b42fb70-17d3-4735-ac5b-723337223db1" (UID: "0b42fb70-17d3-4735-ac5b-723337223db1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.786414 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.812154 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b42fb70-17d3-4735-ac5b-723337223db1-kube-api-access-dp2fk" (OuterVolumeSpecName: "kube-api-access-dp2fk") pod "0b42fb70-17d3-4735-ac5b-723337223db1" (UID: "0b42fb70-17d3-4735-ac5b-723337223db1"). InnerVolumeSpecName "kube-api-access-dp2fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.826414 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b42fb70-17d3-4735-ac5b-723337223db1" (UID: "0b42fb70-17d3-4735-ac5b-723337223db1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.889293 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp2fk\" (UniqueName: \"kubernetes.io/projected/0b42fb70-17d3-4735-ac5b-723337223db1-kube-api-access-dp2fk\") on node \"crc\" DevicePath \"\"" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.889338 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b42fb70-17d3-4735-ac5b-723337223db1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.920057 4921 generic.go:334] "Generic (PLEG): container finished" podID="0b42fb70-17d3-4735-ac5b-723337223db1" containerID="846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502" exitCode=0 Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.920097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerDied","Data":"846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502"} Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.920123 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8w2q2" event={"ID":"0b42fb70-17d3-4735-ac5b-723337223db1","Type":"ContainerDied","Data":"0139f39bca24dc64953d79d4690a257119d3cecbc8cc55bd9288002781bdc60d"} Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.920142 4921 scope.go:117] "RemoveContainer" containerID="846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.920263 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8w2q2" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.952968 4921 scope.go:117] "RemoveContainer" containerID="910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.976530 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w2q2"] Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.985603 4921 scope.go:117] "RemoveContainer" containerID="bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2" Mar 12 14:11:26 crc kubenswrapper[4921]: I0312 14:11:26.987424 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8w2q2"] Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.038182 4921 scope.go:117] "RemoveContainer" containerID="846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502" Mar 12 14:11:27 crc kubenswrapper[4921]: E0312 14:11:27.038698 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502\": container with ID starting with 846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502 not found: ID does not exist" containerID="846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502" Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.038746 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502"} err="failed to get container status \"846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502\": rpc error: code = NotFound desc = could not find container \"846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502\": container with ID starting with 846c311d70114859b188b8dd322936978948f47bebd704c70a5b245864549502 not found: 
ID does not exist" Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.038779 4921 scope.go:117] "RemoveContainer" containerID="910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e" Mar 12 14:11:27 crc kubenswrapper[4921]: E0312 14:11:27.039200 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e\": container with ID starting with 910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e not found: ID does not exist" containerID="910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e" Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.039226 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e"} err="failed to get container status \"910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e\": rpc error: code = NotFound desc = could not find container \"910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e\": container with ID starting with 910bd8c5ec348bb197b77d23efb7015cf3f6c9e0060a2de2700026fcbe18785e not found: ID does not exist" Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.039240 4921 scope.go:117] "RemoveContainer" containerID="bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2" Mar 12 14:11:27 crc kubenswrapper[4921]: E0312 14:11:27.039613 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2\": container with ID starting with bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2 not found: ID does not exist" containerID="bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2" Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.039654 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2"} err="failed to get container status \"bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2\": rpc error: code = NotFound desc = could not find container \"bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2\": container with ID starting with bc68858c96f0a4d374e25edccc7375c72d3e696a1e1234fe4d19fdb44ac2deb2 not found: ID does not exist" Mar 12 14:11:27 crc kubenswrapper[4921]: I0312 14:11:27.999183 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" path="/var/lib/kubelet/pods/0b42fb70-17d3-4735-ac5b-723337223db1/volumes" Mar 12 14:11:37 crc kubenswrapper[4921]: I0312 14:11:37.989689 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:11:37 crc kubenswrapper[4921]: E0312 14:11:37.990460 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:11:52 crc kubenswrapper[4921]: I0312 14:11:52.983641 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:11:52 crc kubenswrapper[4921]: E0312 14:11:52.984424 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.139883 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555412-8brfw"] Mar 12 14:12:00 crc kubenswrapper[4921]: E0312 14:12:00.140980 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="extract-content" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.140999 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="extract-content" Mar 12 14:12:00 crc kubenswrapper[4921]: E0312 14:12:00.141014 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="registry-server" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.141021 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="registry-server" Mar 12 14:12:00 crc kubenswrapper[4921]: E0312 14:12:00.141040 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="extract-utilities" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.141047 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="extract-utilities" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.141281 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b42fb70-17d3-4735-ac5b-723337223db1" containerName="registry-server" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.142065 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.143986 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.144101 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.145275 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.147698 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555412-8brfw"] Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.261760 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ml6k\" (UniqueName: \"kubernetes.io/projected/c85e233c-885c-4bbf-be47-c8437a37a46b-kube-api-access-7ml6k\") pod \"auto-csr-approver-29555412-8brfw\" (UID: \"c85e233c-885c-4bbf-be47-c8437a37a46b\") " pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.363832 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ml6k\" (UniqueName: \"kubernetes.io/projected/c85e233c-885c-4bbf-be47-c8437a37a46b-kube-api-access-7ml6k\") pod \"auto-csr-approver-29555412-8brfw\" (UID: \"c85e233c-885c-4bbf-be47-c8437a37a46b\") " pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.389230 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ml6k\" (UniqueName: \"kubernetes.io/projected/c85e233c-885c-4bbf-be47-c8437a37a46b-kube-api-access-7ml6k\") pod \"auto-csr-approver-29555412-8brfw\" (UID: \"c85e233c-885c-4bbf-be47-c8437a37a46b\") " 
pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.466301 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:00 crc kubenswrapper[4921]: I0312 14:12:00.931291 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555412-8brfw"] Mar 12 14:12:01 crc kubenswrapper[4921]: I0312 14:12:01.245561 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555412-8brfw" event={"ID":"c85e233c-885c-4bbf-be47-c8437a37a46b","Type":"ContainerStarted","Data":"3c6cc174f5498d23e1a2333aa6350da01426a486afb1fb1d9bcb80fa90261232"} Mar 12 14:12:03 crc kubenswrapper[4921]: I0312 14:12:03.289033 4921 generic.go:334] "Generic (PLEG): container finished" podID="c85e233c-885c-4bbf-be47-c8437a37a46b" containerID="1e4fbe54207181963f3ffecbb4c1859ea61cced7fb37e29fe4f1ec112d86e22d" exitCode=0 Mar 12 14:12:03 crc kubenswrapper[4921]: I0312 14:12:03.289464 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555412-8brfw" event={"ID":"c85e233c-885c-4bbf-be47-c8437a37a46b","Type":"ContainerDied","Data":"1e4fbe54207181963f3ffecbb4c1859ea61cced7fb37e29fe4f1ec112d86e22d"} Mar 12 14:12:04 crc kubenswrapper[4921]: I0312 14:12:04.875207 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:04 crc kubenswrapper[4921]: I0312 14:12:04.974832 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ml6k\" (UniqueName: \"kubernetes.io/projected/c85e233c-885c-4bbf-be47-c8437a37a46b-kube-api-access-7ml6k\") pod \"c85e233c-885c-4bbf-be47-c8437a37a46b\" (UID: \"c85e233c-885c-4bbf-be47-c8437a37a46b\") " Mar 12 14:12:04 crc kubenswrapper[4921]: I0312 14:12:04.988676 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85e233c-885c-4bbf-be47-c8437a37a46b-kube-api-access-7ml6k" (OuterVolumeSpecName: "kube-api-access-7ml6k") pod "c85e233c-885c-4bbf-be47-c8437a37a46b" (UID: "c85e233c-885c-4bbf-be47-c8437a37a46b"). InnerVolumeSpecName "kube-api-access-7ml6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.077586 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ml6k\" (UniqueName: \"kubernetes.io/projected/c85e233c-885c-4bbf-be47-c8437a37a46b-kube-api-access-7ml6k\") on node \"crc\" DevicePath \"\"" Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.305774 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555412-8brfw" event={"ID":"c85e233c-885c-4bbf-be47-c8437a37a46b","Type":"ContainerDied","Data":"3c6cc174f5498d23e1a2333aa6350da01426a486afb1fb1d9bcb80fa90261232"} Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.305839 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6cc174f5498d23e1a2333aa6350da01426a486afb1fb1d9bcb80fa90261232" Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.305897 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555412-8brfw" Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.932895 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555406-pfcj8"] Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.941059 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555406-pfcj8"] Mar 12 14:12:05 crc kubenswrapper[4921]: I0312 14:12:05.993517 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adfd5a52-50b8-4d44-a80e-856ccc5e5514" path="/var/lib/kubelet/pods/adfd5a52-50b8-4d44-a80e-856ccc5e5514/volumes" Mar 12 14:12:06 crc kubenswrapper[4921]: I0312 14:12:06.984486 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:12:06 crc kubenswrapper[4921]: E0312 14:12:06.985148 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:12:21 crc kubenswrapper[4921]: I0312 14:12:21.983344 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:12:21 crc kubenswrapper[4921]: E0312 14:12:21.984156 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:12:36 crc kubenswrapper[4921]: I0312 14:12:36.983855 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:12:36 crc kubenswrapper[4921]: E0312 14:12:36.984622 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:12:46 crc kubenswrapper[4921]: I0312 14:12:46.124054 4921 scope.go:117] "RemoveContainer" containerID="6de3c2bcbfa20597e4f27b96a1001ec8b50ab0a7a2179aedc7112b9d31ba1d86" Mar 12 14:12:50 crc kubenswrapper[4921]: I0312 14:12:50.983467 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:12:50 crc kubenswrapper[4921]: E0312 14:12:50.984339 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:13:01 crc kubenswrapper[4921]: I0312 14:13:01.984587 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:13:01 crc kubenswrapper[4921]: E0312 14:13:01.985506 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:13:16 crc kubenswrapper[4921]: I0312 14:13:16.983009 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:13:16 crc kubenswrapper[4921]: E0312 14:13:16.983790 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:13:29 crc kubenswrapper[4921]: I0312 14:13:29.983547 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:13:29 crc kubenswrapper[4921]: E0312 14:13:29.984412 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:13:43 crc kubenswrapper[4921]: I0312 14:13:43.984533 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:13:43 crc kubenswrapper[4921]: E0312 14:13:43.985494 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:13:55 crc kubenswrapper[4921]: I0312 14:13:55.983792 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:13:55 crc kubenswrapper[4921]: E0312 14:13:55.984462 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.170507 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555414-xnzjk"] Mar 12 14:14:00 crc kubenswrapper[4921]: E0312 14:14:00.171479 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85e233c-885c-4bbf-be47-c8437a37a46b" containerName="oc" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.171492 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85e233c-885c-4bbf-be47-c8437a37a46b" containerName="oc" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.171703 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85e233c-885c-4bbf-be47-c8437a37a46b" containerName="oc" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.172333 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.176446 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.178911 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.179144 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.192730 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555414-xnzjk"] Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.329956 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcqpt\" (UniqueName: \"kubernetes.io/projected/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9-kube-api-access-qcqpt\") pod \"auto-csr-approver-29555414-xnzjk\" (UID: \"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9\") " pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.432042 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcqpt\" (UniqueName: \"kubernetes.io/projected/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9-kube-api-access-qcqpt\") pod \"auto-csr-approver-29555414-xnzjk\" (UID: \"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9\") " pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.455551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcqpt\" (UniqueName: \"kubernetes.io/projected/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9-kube-api-access-qcqpt\") pod \"auto-csr-approver-29555414-xnzjk\" (UID: \"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9\") " 
pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.502259 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:00 crc kubenswrapper[4921]: I0312 14:14:00.969139 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555414-xnzjk"] Mar 12 14:14:01 crc kubenswrapper[4921]: I0312 14:14:01.397111 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" event={"ID":"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9","Type":"ContainerStarted","Data":"600f8d3d8e79821d550f27344cd5b574fead32149dcc22ee1ce97fd963d53f36"} Mar 12 14:14:03 crc kubenswrapper[4921]: I0312 14:14:03.415546 4921 generic.go:334] "Generic (PLEG): container finished" podID="5a4240d3-387f-42dd-a1ed-5a81ebfb96e9" containerID="aa95301622884e889de9029c8e7a238cf07b727f86cc66d6b0760bd007648398" exitCode=0 Mar 12 14:14:03 crc kubenswrapper[4921]: I0312 14:14:03.415602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" event={"ID":"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9","Type":"ContainerDied","Data":"aa95301622884e889de9029c8e7a238cf07b727f86cc66d6b0760bd007648398"} Mar 12 14:14:04 crc kubenswrapper[4921]: I0312 14:14:04.984006 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:05 crc kubenswrapper[4921]: I0312 14:14:05.124735 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcqpt\" (UniqueName: \"kubernetes.io/projected/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9-kube-api-access-qcqpt\") pod \"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9\" (UID: \"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9\") " Mar 12 14:14:05 crc kubenswrapper[4921]: I0312 14:14:05.132219 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9-kube-api-access-qcqpt" (OuterVolumeSpecName: "kube-api-access-qcqpt") pod "5a4240d3-387f-42dd-a1ed-5a81ebfb96e9" (UID: "5a4240d3-387f-42dd-a1ed-5a81ebfb96e9"). InnerVolumeSpecName "kube-api-access-qcqpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:14:05 crc kubenswrapper[4921]: I0312 14:14:05.227831 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcqpt\" (UniqueName: \"kubernetes.io/projected/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9-kube-api-access-qcqpt\") on node \"crc\" DevicePath \"\"" Mar 12 14:14:05 crc kubenswrapper[4921]: I0312 14:14:05.440580 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" event={"ID":"5a4240d3-387f-42dd-a1ed-5a81ebfb96e9","Type":"ContainerDied","Data":"600f8d3d8e79821d550f27344cd5b574fead32149dcc22ee1ce97fd963d53f36"} Mar 12 14:14:05 crc kubenswrapper[4921]: I0312 14:14:05.440960 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600f8d3d8e79821d550f27344cd5b574fead32149dcc22ee1ce97fd963d53f36" Mar 12 14:14:05 crc kubenswrapper[4921]: I0312 14:14:05.440616 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555414-xnzjk" Mar 12 14:14:06 crc kubenswrapper[4921]: I0312 14:14:06.079925 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555408-gh4wv"] Mar 12 14:14:06 crc kubenswrapper[4921]: I0312 14:14:06.087627 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555408-gh4wv"] Mar 12 14:14:08 crc kubenswrapper[4921]: I0312 14:14:08.005631 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c93ce20a-c243-456e-8c80-b35db8025428" path="/var/lib/kubelet/pods/c93ce20a-c243-456e-8c80-b35db8025428/volumes" Mar 12 14:14:10 crc kubenswrapper[4921]: I0312 14:14:10.983156 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:14:10 crc kubenswrapper[4921]: E0312 14:14:10.983791 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:14:24 crc kubenswrapper[4921]: I0312 14:14:24.984166 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:14:24 crc kubenswrapper[4921]: E0312 14:14:24.984742 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:14:37 crc kubenswrapper[4921]: I0312 14:14:37.988983 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:14:37 crc kubenswrapper[4921]: E0312 14:14:37.989584 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:14:46 crc kubenswrapper[4921]: I0312 14:14:46.232857 4921 scope.go:117] "RemoveContainer" containerID="50be04af38672c6279295602bb9bfee3662ce916cc804d6adb830cdf8b7ae601" Mar 12 14:14:49 crc kubenswrapper[4921]: I0312 14:14:49.984008 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:14:49 crc kubenswrapper[4921]: E0312 14:14:49.984855 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.165230 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns"] Mar 12 14:15:00 crc kubenswrapper[4921]: E0312 14:15:00.166075 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a4240d3-387f-42dd-a1ed-5a81ebfb96e9" containerName="oc" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 
14:15:00.166087 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a4240d3-387f-42dd-a1ed-5a81ebfb96e9" containerName="oc" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.166265 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a4240d3-387f-42dd-a1ed-5a81ebfb96e9" containerName="oc" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.166990 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.168651 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.169442 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.175591 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns"] Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.272724 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c314933f-5c81-4c99-872c-e16f3d6317f4-secret-volume\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.272980 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q87bg\" (UniqueName: \"kubernetes.io/projected/c314933f-5c81-4c99-872c-e16f3d6317f4-kube-api-access-q87bg\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.273078 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c314933f-5c81-4c99-872c-e16f3d6317f4-config-volume\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.374458 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q87bg\" (UniqueName: \"kubernetes.io/projected/c314933f-5c81-4c99-872c-e16f3d6317f4-kube-api-access-q87bg\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.374535 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c314933f-5c81-4c99-872c-e16f3d6317f4-config-volume\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.374589 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c314933f-5c81-4c99-872c-e16f3d6317f4-secret-volume\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.375929 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c314933f-5c81-4c99-872c-e16f3d6317f4-config-volume\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.393990 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c314933f-5c81-4c99-872c-e16f3d6317f4-secret-volume\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.402541 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q87bg\" (UniqueName: \"kubernetes.io/projected/c314933f-5c81-4c99-872c-e16f3d6317f4-kube-api-access-q87bg\") pod \"collect-profiles-29555415-xsmns\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:00 crc kubenswrapper[4921]: I0312 14:15:00.521666 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:01 crc kubenswrapper[4921]: I0312 14:15:01.043068 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns"] Mar 12 14:15:01 crc kubenswrapper[4921]: I0312 14:15:01.975470 4921 generic.go:334] "Generic (PLEG): container finished" podID="c314933f-5c81-4c99-872c-e16f3d6317f4" containerID="e6f2d1304816860c507b46e8230027c3fa84b5c66cb0f65241c8cd5403d96885" exitCode=0 Mar 12 14:15:01 crc kubenswrapper[4921]: I0312 14:15:01.975713 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" event={"ID":"c314933f-5c81-4c99-872c-e16f3d6317f4","Type":"ContainerDied","Data":"e6f2d1304816860c507b46e8230027c3fa84b5c66cb0f65241c8cd5403d96885"} Mar 12 14:15:01 crc kubenswrapper[4921]: I0312 14:15:01.975923 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" event={"ID":"c314933f-5c81-4c99-872c-e16f3d6317f4","Type":"ContainerStarted","Data":"addc56301e7b2d1895f238a5f02a15678662b2f7bb9f1a83bf1843a2c34815b5"} Mar 12 14:15:01 crc kubenswrapper[4921]: I0312 14:15:01.983661 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:15:01 crc kubenswrapper[4921]: E0312 14:15:01.984123 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.530254 4921 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.649484 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c314933f-5c81-4c99-872c-e16f3d6317f4-config-volume\") pod \"c314933f-5c81-4c99-872c-e16f3d6317f4\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.649570 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c314933f-5c81-4c99-872c-e16f3d6317f4-secret-volume\") pod \"c314933f-5c81-4c99-872c-e16f3d6317f4\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.649598 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q87bg\" (UniqueName: \"kubernetes.io/projected/c314933f-5c81-4c99-872c-e16f3d6317f4-kube-api-access-q87bg\") pod \"c314933f-5c81-4c99-872c-e16f3d6317f4\" (UID: \"c314933f-5c81-4c99-872c-e16f3d6317f4\") " Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.650413 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c314933f-5c81-4c99-872c-e16f3d6317f4-config-volume" (OuterVolumeSpecName: "config-volume") pod "c314933f-5c81-4c99-872c-e16f3d6317f4" (UID: "c314933f-5c81-4c99-872c-e16f3d6317f4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.663397 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c314933f-5c81-4c99-872c-e16f3d6317f4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c314933f-5c81-4c99-872c-e16f3d6317f4" (UID: "c314933f-5c81-4c99-872c-e16f3d6317f4"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.665996 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c314933f-5c81-4c99-872c-e16f3d6317f4-kube-api-access-q87bg" (OuterVolumeSpecName: "kube-api-access-q87bg") pod "c314933f-5c81-4c99-872c-e16f3d6317f4" (UID: "c314933f-5c81-4c99-872c-e16f3d6317f4"). InnerVolumeSpecName "kube-api-access-q87bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.752123 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c314933f-5c81-4c99-872c-e16f3d6317f4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.752177 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c314933f-5c81-4c99-872c-e16f3d6317f4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.752192 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q87bg\" (UniqueName: \"kubernetes.io/projected/c314933f-5c81-4c99-872c-e16f3d6317f4-kube-api-access-q87bg\") on node \"crc\" DevicePath \"\"" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.992367 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.994233 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns" event={"ID":"c314933f-5c81-4c99-872c-e16f3d6317f4","Type":"ContainerDied","Data":"addc56301e7b2d1895f238a5f02a15678662b2f7bb9f1a83bf1843a2c34815b5"} Mar 12 14:15:03 crc kubenswrapper[4921]: I0312 14:15:03.994386 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="addc56301e7b2d1895f238a5f02a15678662b2f7bb9f1a83bf1843a2c34815b5" Mar 12 14:15:04 crc kubenswrapper[4921]: I0312 14:15:04.607909 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l"] Mar 12 14:15:04 crc kubenswrapper[4921]: I0312 14:15:04.619112 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555370-7cx7l"] Mar 12 14:15:05 crc kubenswrapper[4921]: I0312 14:15:05.996905 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e52e7fe-8b59-4e0f-a70d-cc63836749b4" path="/var/lib/kubelet/pods/9e52e7fe-8b59-4e0f-a70d-cc63836749b4/volumes" Mar 12 14:15:13 crc kubenswrapper[4921]: I0312 14:15:13.983969 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:15:13 crc kubenswrapper[4921]: E0312 14:15:13.984734 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:15:28 crc 
kubenswrapper[4921]: I0312 14:15:28.983699 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:15:28 crc kubenswrapper[4921]: E0312 14:15:28.984856 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:15:39 crc kubenswrapper[4921]: I0312 14:15:39.983667 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:15:39 crc kubenswrapper[4921]: E0312 14:15:39.984696 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:15:46 crc kubenswrapper[4921]: I0312 14:15:46.590784 4921 scope.go:117] "RemoveContainer" containerID="3c6e616b4b05287a4a4056d55160b0b809b8de7a24bcd9a779b790e54a669cb9" Mar 12 14:15:50 crc kubenswrapper[4921]: I0312 14:15:50.983994 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:15:50 crc kubenswrapper[4921]: E0312 14:15:50.984982 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.152607 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555416-jwnhq"] Mar 12 14:16:00 crc kubenswrapper[4921]: E0312 14:16:00.153614 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c314933f-5c81-4c99-872c-e16f3d6317f4" containerName="collect-profiles" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.153629 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c314933f-5c81-4c99-872c-e16f3d6317f4" containerName="collect-profiles" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.153856 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c314933f-5c81-4c99-872c-e16f3d6317f4" containerName="collect-profiles" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.154729 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.159397 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.160555 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.161135 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.166307 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555416-jwnhq"] Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.213019 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvpc\" (UniqueName: \"kubernetes.io/projected/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1-kube-api-access-4zvpc\") pod \"auto-csr-approver-29555416-jwnhq\" (UID: \"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1\") " pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.314934 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvpc\" (UniqueName: \"kubernetes.io/projected/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1-kube-api-access-4zvpc\") pod \"auto-csr-approver-29555416-jwnhq\" (UID: \"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1\") " pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.335489 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvpc\" (UniqueName: \"kubernetes.io/projected/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1-kube-api-access-4zvpc\") pod \"auto-csr-approver-29555416-jwnhq\" (UID: \"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1\") " 
pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:00 crc kubenswrapper[4921]: I0312 14:16:00.513646 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:01 crc kubenswrapper[4921]: I0312 14:16:01.013575 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555416-jwnhq"] Mar 12 14:16:01 crc kubenswrapper[4921]: I0312 14:16:01.788000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" event={"ID":"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1","Type":"ContainerStarted","Data":"11d61478eb10d51bd139d88005b61c9d9e01cc60c33c7d8b10cd4c347f18a061"} Mar 12 14:16:02 crc kubenswrapper[4921]: I0312 14:16:02.799172 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" event={"ID":"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1","Type":"ContainerStarted","Data":"694358b65ef6dda4bd9b20e54980fd7074ff88dff9bff0887a59ea67649a2ea9"} Mar 12 14:16:02 crc kubenswrapper[4921]: I0312 14:16:02.816956 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" podStartSLOduration=1.743621452 podStartE2EDuration="2.816934212s" podCreationTimestamp="2026-03-12 14:16:00 +0000 UTC" firstStartedPulling="2026-03-12 14:16:01.019460569 +0000 UTC m=+3983.709532530" lastFinishedPulling="2026-03-12 14:16:02.092773309 +0000 UTC m=+3984.782845290" observedRunningTime="2026-03-12 14:16:02.813770694 +0000 UTC m=+3985.503842665" watchObservedRunningTime="2026-03-12 14:16:02.816934212 +0000 UTC m=+3985.507006183" Mar 12 14:16:03 crc kubenswrapper[4921]: I0312 14:16:03.807063 4921 generic.go:334] "Generic (PLEG): container finished" podID="0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1" containerID="694358b65ef6dda4bd9b20e54980fd7074ff88dff9bff0887a59ea67649a2ea9" exitCode=0 Mar 12 14:16:03 crc 
kubenswrapper[4921]: I0312 14:16:03.807258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" event={"ID":"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1","Type":"ContainerDied","Data":"694358b65ef6dda4bd9b20e54980fd7074ff88dff9bff0887a59ea67649a2ea9"} Mar 12 14:16:04 crc kubenswrapper[4921]: I0312 14:16:04.983806 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.424115 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.521012 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvpc\" (UniqueName: \"kubernetes.io/projected/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1-kube-api-access-4zvpc\") pod \"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1\" (UID: \"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1\") " Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.823660 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" event={"ID":"0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1","Type":"ContainerDied","Data":"11d61478eb10d51bd139d88005b61c9d9e01cc60c33c7d8b10cd4c347f18a061"} Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.823709 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11d61478eb10d51bd139d88005b61c9d9e01cc60c33c7d8b10cd4c347f18a061" Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.823674 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555416-jwnhq" Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.826194 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"6bc5f01d3dd879fd949dcd43b51ed7002793c74a0fcf4b2431e6945845a731d6"} Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.910155 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1-kube-api-access-4zvpc" (OuterVolumeSpecName: "kube-api-access-4zvpc") pod "0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1" (UID: "0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1"). InnerVolumeSpecName "kube-api-access-4zvpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.926855 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555410-4mtkl"] Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.936303 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvpc\" (UniqueName: \"kubernetes.io/projected/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1-kube-api-access-4zvpc\") on node \"crc\" DevicePath \"\"" Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.942110 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555410-4mtkl"] Mar 12 14:16:05 crc kubenswrapper[4921]: I0312 14:16:05.996373 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18" path="/var/lib/kubelet/pods/61e9ad3e-f0d0-4e4e-839a-eec71b8cdf18/volumes" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.586229 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7nvcf"] Mar 12 14:16:44 crc kubenswrapper[4921]: E0312 14:16:44.587097 4921 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1" containerName="oc" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.587108 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1" containerName="oc" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.587310 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1" containerName="oc" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.588590 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.600028 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nvcf"] Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.756271 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-utilities\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.756698 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzl4\" (UniqueName: \"kubernetes.io/projected/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-kube-api-access-klzl4\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.756785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-catalog-content\") pod 
\"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.858928 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-catalog-content\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.859050 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-utilities\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.859109 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klzl4\" (UniqueName: \"kubernetes.io/projected/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-kube-api-access-klzl4\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.859723 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-catalog-content\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.859776 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-utilities\") pod \"community-operators-7nvcf\" (UID: 
\"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.880211 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzl4\" (UniqueName: \"kubernetes.io/projected/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-kube-api-access-klzl4\") pod \"community-operators-7nvcf\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:44 crc kubenswrapper[4921]: I0312 14:16:44.964125 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:45 crc kubenswrapper[4921]: I0312 14:16:45.570785 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7nvcf"] Mar 12 14:16:46 crc kubenswrapper[4921]: I0312 14:16:46.205473 4921 generic.go:334] "Generic (PLEG): container finished" podID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerID="bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3" exitCode=0 Mar 12 14:16:46 crc kubenswrapper[4921]: I0312 14:16:46.205677 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerDied","Data":"bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3"} Mar 12 14:16:46 crc kubenswrapper[4921]: I0312 14:16:46.205841 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerStarted","Data":"a8ce0b28f942042daf9f745ad39858c04fde0c9a68798309864070c7516c0a0d"} Mar 12 14:16:46 crc kubenswrapper[4921]: I0312 14:16:46.208837 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:16:46 crc kubenswrapper[4921]: I0312 14:16:46.677579 4921 
scope.go:117] "RemoveContainer" containerID="3f7acd354d16fde7500d2b695982d47bdaae1eb2757bfa56e643a5a37b76b9e2" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.169194 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6lfk4"] Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.172911 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.186131 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6lfk4"] Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.243186 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-catalog-content\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.243377 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-utilities\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.243638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdtt\" (UniqueName: \"kubernetes.io/projected/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-kube-api-access-zvdtt\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.345377 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zvdtt\" (UniqueName: \"kubernetes.io/projected/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-kube-api-access-zvdtt\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.345523 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-catalog-content\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.345607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-utilities\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.346100 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-catalog-content\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.346155 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-utilities\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.367744 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdtt\" (UniqueName: 
\"kubernetes.io/projected/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-kube-api-access-zvdtt\") pod \"redhat-operators-6lfk4\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:47 crc kubenswrapper[4921]: I0312 14:16:47.535570 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:48 crc kubenswrapper[4921]: I0312 14:16:48.065908 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6lfk4"] Mar 12 14:16:48 crc kubenswrapper[4921]: I0312 14:16:48.226223 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerStarted","Data":"0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0"} Mar 12 14:16:48 crc kubenswrapper[4921]: W0312 14:16:48.610392 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a63c206_749a_42f4_bcd4_0a8f7dda1f7f.slice/crio-1a68848620c65bf9599e0b5d2c93c879bbba106b9dfab664742489e13b0d0b1e WatchSource:0}: Error finding container 1a68848620c65bf9599e0b5d2c93c879bbba106b9dfab664742489e13b0d0b1e: Status 404 returned error can't find the container with id 1a68848620c65bf9599e0b5d2c93c879bbba106b9dfab664742489e13b0d0b1e Mar 12 14:16:49 crc kubenswrapper[4921]: I0312 14:16:49.234679 4921 generic.go:334] "Generic (PLEG): container finished" podID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerID="0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0" exitCode=0 Mar 12 14:16:49 crc kubenswrapper[4921]: I0312 14:16:49.234723 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" 
event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerDied","Data":"0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0"} Mar 12 14:16:49 crc kubenswrapper[4921]: I0312 14:16:49.236834 4921 generic.go:334] "Generic (PLEG): container finished" podID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerID="87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45" exitCode=0 Mar 12 14:16:49 crc kubenswrapper[4921]: I0312 14:16:49.236856 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerDied","Data":"87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45"} Mar 12 14:16:49 crc kubenswrapper[4921]: I0312 14:16:49.236887 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerStarted","Data":"1a68848620c65bf9599e0b5d2c93c879bbba106b9dfab664742489e13b0d0b1e"} Mar 12 14:16:50 crc kubenswrapper[4921]: I0312 14:16:50.246349 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerStarted","Data":"915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061"} Mar 12 14:16:50 crc kubenswrapper[4921]: I0312 14:16:50.272496 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7nvcf" podStartSLOduration=2.806313908 podStartE2EDuration="6.272477949s" podCreationTimestamp="2026-03-12 14:16:44 +0000 UTC" firstStartedPulling="2026-03-12 14:16:46.208413584 +0000 UTC m=+4028.898485565" lastFinishedPulling="2026-03-12 14:16:49.674577635 +0000 UTC m=+4032.364649606" observedRunningTime="2026-03-12 14:16:50.271088746 +0000 UTC m=+4032.961160717" watchObservedRunningTime="2026-03-12 14:16:50.272477949 +0000 UTC 
m=+4032.962549920" Mar 12 14:16:51 crc kubenswrapper[4921]: I0312 14:16:51.254639 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerStarted","Data":"6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c"} Mar 12 14:16:54 crc kubenswrapper[4921]: I0312 14:16:54.965223 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:54 crc kubenswrapper[4921]: I0312 14:16:54.965785 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:16:55 crc kubenswrapper[4921]: I0312 14:16:55.295354 4921 generic.go:334] "Generic (PLEG): container finished" podID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerID="6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c" exitCode=0 Mar 12 14:16:55 crc kubenswrapper[4921]: I0312 14:16:55.295470 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerDied","Data":"6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c"} Mar 12 14:16:56 crc kubenswrapper[4921]: I0312 14:16:56.020403 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-7nvcf" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="registry-server" probeResult="failure" output=< Mar 12 14:16:56 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:16:56 crc kubenswrapper[4921]: > Mar 12 14:16:56 crc kubenswrapper[4921]: I0312 14:16:56.307097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" 
event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerStarted","Data":"4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd"} Mar 12 14:16:56 crc kubenswrapper[4921]: I0312 14:16:56.325285 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6lfk4" podStartSLOduration=2.841730546 podStartE2EDuration="9.325270818s" podCreationTimestamp="2026-03-12 14:16:47 +0000 UTC" firstStartedPulling="2026-03-12 14:16:49.238135647 +0000 UTC m=+4031.928207618" lastFinishedPulling="2026-03-12 14:16:55.721675919 +0000 UTC m=+4038.411747890" observedRunningTime="2026-03-12 14:16:56.323419421 +0000 UTC m=+4039.013491392" watchObservedRunningTime="2026-03-12 14:16:56.325270818 +0000 UTC m=+4039.015342789" Mar 12 14:16:57 crc kubenswrapper[4921]: I0312 14:16:57.540243 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:57 crc kubenswrapper[4921]: I0312 14:16:57.540572 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:16:58 crc kubenswrapper[4921]: I0312 14:16:58.588603 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6lfk4" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" probeResult="failure" output=< Mar 12 14:16:58 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:16:58 crc kubenswrapper[4921]: > Mar 12 14:17:05 crc kubenswrapper[4921]: I0312 14:17:05.018759 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:17:05 crc kubenswrapper[4921]: I0312 14:17:05.071860 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:17:05 crc kubenswrapper[4921]: 
I0312 14:17:05.253446 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nvcf"] Mar 12 14:17:06 crc kubenswrapper[4921]: I0312 14:17:06.426657 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7nvcf" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="registry-server" containerID="cri-o://915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061" gracePeriod=2 Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.123533 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.255864 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klzl4\" (UniqueName: \"kubernetes.io/projected/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-kube-api-access-klzl4\") pod \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.256283 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-utilities\") pod \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.256509 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-catalog-content\") pod \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\" (UID: \"5581d28f-35ff-4aa3-8826-ede3bdfbcce4\") " Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.257331 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-utilities" 
(OuterVolumeSpecName: "utilities") pod "5581d28f-35ff-4aa3-8826-ede3bdfbcce4" (UID: "5581d28f-35ff-4aa3-8826-ede3bdfbcce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.264254 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-kube-api-access-klzl4" (OuterVolumeSpecName: "kube-api-access-klzl4") pod "5581d28f-35ff-4aa3-8826-ede3bdfbcce4" (UID: "5581d28f-35ff-4aa3-8826-ede3bdfbcce4"). InnerVolumeSpecName "kube-api-access-klzl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.333471 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5581d28f-35ff-4aa3-8826-ede3bdfbcce4" (UID: "5581d28f-35ff-4aa3-8826-ede3bdfbcce4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.359144 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.359172 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klzl4\" (UniqueName: \"kubernetes.io/projected/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-kube-api-access-klzl4\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.359184 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5581d28f-35ff-4aa3-8826-ede3bdfbcce4-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.437363 4921 generic.go:334] "Generic (PLEG): container finished" podID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerID="915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061" exitCode=0 Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.437408 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerDied","Data":"915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061"} Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.437441 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7nvcf" event={"ID":"5581d28f-35ff-4aa3-8826-ede3bdfbcce4","Type":"ContainerDied","Data":"a8ce0b28f942042daf9f745ad39858c04fde0c9a68798309864070c7516c0a0d"} Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.437460 4921 scope.go:117] "RemoveContainer" containerID="915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 
14:17:07.438596 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7nvcf" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.462204 4921 scope.go:117] "RemoveContainer" containerID="0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.481035 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7nvcf"] Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.489436 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7nvcf"] Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.497807 4921 scope.go:117] "RemoveContainer" containerID="bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.529137 4921 scope.go:117] "RemoveContainer" containerID="915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061" Mar 12 14:17:07 crc kubenswrapper[4921]: E0312 14:17:07.529631 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061\": container with ID starting with 915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061 not found: ID does not exist" containerID="915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.529678 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061"} err="failed to get container status \"915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061\": rpc error: code = NotFound desc = could not find container \"915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061\": container with ID starting with 
915c1fa6eee50671e29c1d2416326990556a0e907b536b25bbe2465169042061 not found: ID does not exist" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.529704 4921 scope.go:117] "RemoveContainer" containerID="0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0" Mar 12 14:17:07 crc kubenswrapper[4921]: E0312 14:17:07.530149 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0\": container with ID starting with 0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0 not found: ID does not exist" containerID="0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.530192 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0"} err="failed to get container status \"0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0\": rpc error: code = NotFound desc = could not find container \"0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0\": container with ID starting with 0e9a50e65aa092509654360e48e854bc0923ad1e607639517e470ac68db7ccf0 not found: ID does not exist" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.530221 4921 scope.go:117] "RemoveContainer" containerID="bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3" Mar 12 14:17:07 crc kubenswrapper[4921]: E0312 14:17:07.530496 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3\": container with ID starting with bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3 not found: ID does not exist" containerID="bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3" Mar 12 14:17:07 crc 
kubenswrapper[4921]: I0312 14:17:07.530522 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3"} err="failed to get container status \"bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3\": rpc error: code = NotFound desc = could not find container \"bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3\": container with ID starting with bc2cedce0f2f0a75c02b425f7fb701f85d803b489cba9bb9ff66e15b499bb3b3 not found: ID does not exist" Mar 12 14:17:07 crc kubenswrapper[4921]: I0312 14:17:07.994964 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" path="/var/lib/kubelet/pods/5581d28f-35ff-4aa3-8826-ede3bdfbcce4/volumes" Mar 12 14:17:08 crc kubenswrapper[4921]: I0312 14:17:08.589362 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6lfk4" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" probeResult="failure" output=< Mar 12 14:17:08 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:17:08 crc kubenswrapper[4921]: > Mar 12 14:17:18 crc kubenswrapper[4921]: I0312 14:17:18.585050 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6lfk4" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" probeResult="failure" output=< Mar 12 14:17:18 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:17:18 crc kubenswrapper[4921]: > Mar 12 14:17:27 crc kubenswrapper[4921]: I0312 14:17:27.609275 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:17:27 crc kubenswrapper[4921]: I0312 14:17:27.707231 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:17:27 crc kubenswrapper[4921]: I0312 14:17:27.853948 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6lfk4"] Mar 12 14:17:28 crc kubenswrapper[4921]: I0312 14:17:28.624954 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6lfk4" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" containerID="cri-o://4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd" gracePeriod=2 Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.288320 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.475163 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-utilities\") pod \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.475278 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvdtt\" (UniqueName: \"kubernetes.io/projected/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-kube-api-access-zvdtt\") pod \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.475346 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-catalog-content\") pod \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\" (UID: \"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f\") " Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.476480 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-utilities" (OuterVolumeSpecName: "utilities") pod "5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" (UID: "5a63c206-749a-42f4-bcd4-0a8f7dda1f7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.480622 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-kube-api-access-zvdtt" (OuterVolumeSpecName: "kube-api-access-zvdtt") pod "5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" (UID: "5a63c206-749a-42f4-bcd4-0a8f7dda1f7f"). InnerVolumeSpecName "kube-api-access-zvdtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.577197 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.577510 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvdtt\" (UniqueName: \"kubernetes.io/projected/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-kube-api-access-zvdtt\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.600978 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" (UID: "5a63c206-749a-42f4-bcd4-0a8f7dda1f7f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.638128 4921 generic.go:334] "Generic (PLEG): container finished" podID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerID="4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd" exitCode=0 Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.638173 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerDied","Data":"4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd"} Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.638215 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6lfk4" event={"ID":"5a63c206-749a-42f4-bcd4-0a8f7dda1f7f","Type":"ContainerDied","Data":"1a68848620c65bf9599e0b5d2c93c879bbba106b9dfab664742489e13b0d0b1e"} Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.638232 4921 scope.go:117] "RemoveContainer" containerID="4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.638344 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6lfk4" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.682254 4921 scope.go:117] "RemoveContainer" containerID="6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.696608 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6lfk4"] Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.703143 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.719860 4921 scope.go:117] "RemoveContainer" containerID="87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.723528 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6lfk4"] Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.776879 4921 scope.go:117] "RemoveContainer" containerID="4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd" Mar 12 14:17:29 crc kubenswrapper[4921]: E0312 14:17:29.777736 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd\": container with ID starting with 4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd not found: ID does not exist" containerID="4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.777767 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd"} err="failed to get container status 
\"4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd\": rpc error: code = NotFound desc = could not find container \"4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd\": container with ID starting with 4c68fe7ac0764b1d9291b3ef98e55b013f63b2c55dcd7e0bd5a55cafea6607dd not found: ID does not exist" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.777790 4921 scope.go:117] "RemoveContainer" containerID="6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c" Mar 12 14:17:29 crc kubenswrapper[4921]: E0312 14:17:29.778173 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c\": container with ID starting with 6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c not found: ID does not exist" containerID="6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.778195 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c"} err="failed to get container status \"6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c\": rpc error: code = NotFound desc = could not find container \"6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c\": container with ID starting with 6d37c2147cf433091a9c2aaf86583cda7786b5b5b8de967e1ee4b3cf0133037c not found: ID does not exist" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.778207 4921 scope.go:117] "RemoveContainer" containerID="87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45" Mar 12 14:17:29 crc kubenswrapper[4921]: E0312 14:17:29.778705 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45\": container with ID starting with 87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45 not found: ID does not exist" containerID="87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.778725 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45"} err="failed to get container status \"87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45\": rpc error: code = NotFound desc = could not find container \"87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45\": container with ID starting with 87418b2fc6302751ec34ffd4ee561c97126a83e2b00f236d9be307715b223b45 not found: ID does not exist" Mar 12 14:17:29 crc kubenswrapper[4921]: I0312 14:17:29.993438 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" path="/var/lib/kubelet/pods/5a63c206-749a-42f4-bcd4-0a8f7dda1f7f/volumes" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.815858 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zpr26"] Mar 12 14:17:35 crc kubenswrapper[4921]: E0312 14:17:35.816896 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="extract-utilities" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.816914 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="extract-utilities" Mar 12 14:17:35 crc kubenswrapper[4921]: E0312 14:17:35.816928 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="registry-server" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.816936 4921 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="registry-server" Mar 12 14:17:35 crc kubenswrapper[4921]: E0312 14:17:35.816956 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="extract-content" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.816964 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="extract-content" Mar 12 14:17:35 crc kubenswrapper[4921]: E0312 14:17:35.816980 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.816988 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" Mar 12 14:17:35 crc kubenswrapper[4921]: E0312 14:17:35.817004 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="extract-content" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.817011 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="extract-content" Mar 12 14:17:35 crc kubenswrapper[4921]: E0312 14:17:35.817022 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="extract-utilities" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.817030 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="extract-utilities" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.817300 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5581d28f-35ff-4aa3-8826-ede3bdfbcce4" containerName="registry-server" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.817323 4921 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5a63c206-749a-42f4-bcd4-0a8f7dda1f7f" containerName="registry-server" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.818942 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.820790 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-catalog-content\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.821039 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7tsn\" (UniqueName: \"kubernetes.io/projected/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-kube-api-access-r7tsn\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.821081 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-utilities\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.827659 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpr26"] Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.923380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-catalog-content\") pod 
\"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.923576 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7tsn\" (UniqueName: \"kubernetes.io/projected/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-kube-api-access-r7tsn\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.923600 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-utilities\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.923912 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-catalog-content\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.924039 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-utilities\") pod \"certified-operators-zpr26\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:35 crc kubenswrapper[4921]: I0312 14:17:35.955455 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7tsn\" (UniqueName: \"kubernetes.io/projected/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-kube-api-access-r7tsn\") pod \"certified-operators-zpr26\" (UID: 
\"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:36 crc kubenswrapper[4921]: I0312 14:17:36.152617 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:36 crc kubenswrapper[4921]: I0312 14:17:36.634186 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpr26"] Mar 12 14:17:36 crc kubenswrapper[4921]: I0312 14:17:36.727647 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerStarted","Data":"9c801d837d968914fd0d86ef7b0c92d7b4bc8cca60b14387e65629c7a9271bfd"} Mar 12 14:17:37 crc kubenswrapper[4921]: I0312 14:17:37.736390 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerID="9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c" exitCode=0 Mar 12 14:17:37 crc kubenswrapper[4921]: I0312 14:17:37.736491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerDied","Data":"9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c"} Mar 12 14:17:38 crc kubenswrapper[4921]: I0312 14:17:38.745388 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerStarted","Data":"154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4"} Mar 12 14:17:40 crc kubenswrapper[4921]: I0312 14:17:40.763849 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerID="154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4" exitCode=0 Mar 12 14:17:40 crc kubenswrapper[4921]: I0312 
14:17:40.763950 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerDied","Data":"154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4"} Mar 12 14:17:41 crc kubenswrapper[4921]: I0312 14:17:41.775633 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerStarted","Data":"89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407"} Mar 12 14:17:41 crc kubenswrapper[4921]: I0312 14:17:41.798032 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zpr26" podStartSLOduration=3.138641689 podStartE2EDuration="6.798009387s" podCreationTimestamp="2026-03-12 14:17:35 +0000 UTC" firstStartedPulling="2026-03-12 14:17:37.739785603 +0000 UTC m=+4080.429857574" lastFinishedPulling="2026-03-12 14:17:41.399153291 +0000 UTC m=+4084.089225272" observedRunningTime="2026-03-12 14:17:41.792537348 +0000 UTC m=+4084.482609319" watchObservedRunningTime="2026-03-12 14:17:41.798009387 +0000 UTC m=+4084.488081358" Mar 12 14:17:46 crc kubenswrapper[4921]: I0312 14:17:46.153597 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:46 crc kubenswrapper[4921]: I0312 14:17:46.154247 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:46 crc kubenswrapper[4921]: I0312 14:17:46.208398 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:46 crc kubenswrapper[4921]: I0312 14:17:46.863663 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zpr26" Mar 
12 14:17:46 crc kubenswrapper[4921]: I0312 14:17:46.912602 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zpr26"] Mar 12 14:17:48 crc kubenswrapper[4921]: I0312 14:17:48.829123 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zpr26" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="registry-server" containerID="cri-o://89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407" gracePeriod=2 Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.548791 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.643192 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7tsn\" (UniqueName: \"kubernetes.io/projected/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-kube-api-access-r7tsn\") pod \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.643336 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-catalog-content\") pod \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.643470 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-utilities\") pod \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\" (UID: \"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb\") " Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.644020 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-utilities" (OuterVolumeSpecName: "utilities") pod "bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" (UID: "bf8f238d-11d5-44c8-a40b-06a4b81ff2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.649433 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-kube-api-access-r7tsn" (OuterVolumeSpecName: "kube-api-access-r7tsn") pod "bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" (UID: "bf8f238d-11d5-44c8-a40b-06a4b81ff2bb"). InnerVolumeSpecName "kube-api-access-r7tsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.695244 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" (UID: "bf8f238d-11d5-44c8-a40b-06a4b81ff2bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.745619 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.745656 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7tsn\" (UniqueName: \"kubernetes.io/projected/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-kube-api-access-r7tsn\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.745672 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.838384 4921 generic.go:334] "Generic (PLEG): container finished" podID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerID="89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407" exitCode=0 Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.838429 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpr26" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.838447 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerDied","Data":"89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407"} Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.838885 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpr26" event={"ID":"bf8f238d-11d5-44c8-a40b-06a4b81ff2bb","Type":"ContainerDied","Data":"9c801d837d968914fd0d86ef7b0c92d7b4bc8cca60b14387e65629c7a9271bfd"} Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.838904 4921 scope.go:117] "RemoveContainer" containerID="89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.856776 4921 scope.go:117] "RemoveContainer" containerID="154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.868355 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zpr26"] Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.879997 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zpr26"] Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.887096 4921 scope.go:117] "RemoveContainer" containerID="9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.919990 4921 scope.go:117] "RemoveContainer" containerID="89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407" Mar 12 14:17:49 crc kubenswrapper[4921]: E0312 14:17:49.920483 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407\": container with ID starting with 89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407 not found: ID does not exist" containerID="89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.920522 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407"} err="failed to get container status \"89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407\": rpc error: code = NotFound desc = could not find container \"89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407\": container with ID starting with 89cf990692435588b58630df072d552fa202f09b417d366441587ac93c1c5407 not found: ID does not exist" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.920553 4921 scope.go:117] "RemoveContainer" containerID="154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4" Mar 12 14:17:49 crc kubenswrapper[4921]: E0312 14:17:49.920927 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4\": container with ID starting with 154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4 not found: ID does not exist" containerID="154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.920972 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4"} err="failed to get container status \"154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4\": rpc error: code = NotFound desc = could not find container \"154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4\": container with ID 
starting with 154294471cd27b3c66d2f7a0d90d6d0650b7cf83680a047144195e7479a2b7b4 not found: ID does not exist" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.921002 4921 scope.go:117] "RemoveContainer" containerID="9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c" Mar 12 14:17:49 crc kubenswrapper[4921]: E0312 14:17:49.921409 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c\": container with ID starting with 9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c not found: ID does not exist" containerID="9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.921437 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c"} err="failed to get container status \"9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c\": rpc error: code = NotFound desc = could not find container \"9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c\": container with ID starting with 9b29c1a00fbf84682d05c931d330170851bfbf60c71bccdedaaa153eaaf7228c not found: ID does not exist" Mar 12 14:17:49 crc kubenswrapper[4921]: I0312 14:17:49.993144 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" path="/var/lib/kubelet/pods/bf8f238d-11d5-44c8-a40b-06a4b81ff2bb/volumes" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.160922 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555418-xxgkp"] Mar 12 14:18:00 crc kubenswrapper[4921]: E0312 14:18:00.161851 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="registry-server" Mar 12 14:18:00 crc 
kubenswrapper[4921]: I0312 14:18:00.161864 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="registry-server" Mar 12 14:18:00 crc kubenswrapper[4921]: E0312 14:18:00.161874 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="extract-utilities" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.161881 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="extract-utilities" Mar 12 14:18:00 crc kubenswrapper[4921]: E0312 14:18:00.161913 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="extract-content" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.161921 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="extract-content" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.162120 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf8f238d-11d5-44c8-a40b-06a4b81ff2bb" containerName="registry-server" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.162807 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.165302 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.165453 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.166926 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.170988 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555418-xxgkp"] Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.262065 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8hrt\" (UniqueName: \"kubernetes.io/projected/d4600b08-7049-4e69-a857-490413d1d7f2-kube-api-access-h8hrt\") pod \"auto-csr-approver-29555418-xxgkp\" (UID: \"d4600b08-7049-4e69-a857-490413d1d7f2\") " pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.364711 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8hrt\" (UniqueName: \"kubernetes.io/projected/d4600b08-7049-4e69-a857-490413d1d7f2-kube-api-access-h8hrt\") pod \"auto-csr-approver-29555418-xxgkp\" (UID: \"d4600b08-7049-4e69-a857-490413d1d7f2\") " pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.387588 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8hrt\" (UniqueName: \"kubernetes.io/projected/d4600b08-7049-4e69-a857-490413d1d7f2-kube-api-access-h8hrt\") pod \"auto-csr-approver-29555418-xxgkp\" (UID: \"d4600b08-7049-4e69-a857-490413d1d7f2\") " 
pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:00 crc kubenswrapper[4921]: I0312 14:18:00.480174 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:01 crc kubenswrapper[4921]: I0312 14:18:01.037636 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555418-xxgkp"] Mar 12 14:18:01 crc kubenswrapper[4921]: W0312 14:18:01.054318 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4600b08_7049_4e69_a857_490413d1d7f2.slice/crio-80b482ac958f032177d797ce2f0d194dc06d79a90a72c8b0ee9225f5e829016c WatchSource:0}: Error finding container 80b482ac958f032177d797ce2f0d194dc06d79a90a72c8b0ee9225f5e829016c: Status 404 returned error can't find the container with id 80b482ac958f032177d797ce2f0d194dc06d79a90a72c8b0ee9225f5e829016c Mar 12 14:18:02 crc kubenswrapper[4921]: I0312 14:18:02.014418 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" event={"ID":"d4600b08-7049-4e69-a857-490413d1d7f2","Type":"ContainerStarted","Data":"80b482ac958f032177d797ce2f0d194dc06d79a90a72c8b0ee9225f5e829016c"} Mar 12 14:18:03 crc kubenswrapper[4921]: I0312 14:18:03.024355 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" event={"ID":"d4600b08-7049-4e69-a857-490413d1d7f2","Type":"ContainerStarted","Data":"d487f8845a170909ae96bd8ca22cf1ad0124a90292dc2c9b1982ffa96a69acd4"} Mar 12 14:18:03 crc kubenswrapper[4921]: I0312 14:18:03.040951 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" podStartSLOduration=2.19083903 podStartE2EDuration="3.040931702s" podCreationTimestamp="2026-03-12 14:18:00 +0000 UTC" firstStartedPulling="2026-03-12 14:18:01.059903095 +0000 UTC 
m=+4103.749975056" lastFinishedPulling="2026-03-12 14:18:01.909995757 +0000 UTC m=+4104.600067728" observedRunningTime="2026-03-12 14:18:03.037877029 +0000 UTC m=+4105.727949020" watchObservedRunningTime="2026-03-12 14:18:03.040931702 +0000 UTC m=+4105.731003683" Mar 12 14:18:04 crc kubenswrapper[4921]: I0312 14:18:04.032896 4921 generic.go:334] "Generic (PLEG): container finished" podID="d4600b08-7049-4e69-a857-490413d1d7f2" containerID="d487f8845a170909ae96bd8ca22cf1ad0124a90292dc2c9b1982ffa96a69acd4" exitCode=0 Mar 12 14:18:04 crc kubenswrapper[4921]: I0312 14:18:04.032946 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" event={"ID":"d4600b08-7049-4e69-a857-490413d1d7f2","Type":"ContainerDied","Data":"d487f8845a170909ae96bd8ca22cf1ad0124a90292dc2c9b1982ffa96a69acd4"} Mar 12 14:18:05 crc kubenswrapper[4921]: I0312 14:18:05.616899 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:05 crc kubenswrapper[4921]: I0312 14:18:05.810742 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8hrt\" (UniqueName: \"kubernetes.io/projected/d4600b08-7049-4e69-a857-490413d1d7f2-kube-api-access-h8hrt\") pod \"d4600b08-7049-4e69-a857-490413d1d7f2\" (UID: \"d4600b08-7049-4e69-a857-490413d1d7f2\") " Mar 12 14:18:05 crc kubenswrapper[4921]: I0312 14:18:05.828518 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4600b08-7049-4e69-a857-490413d1d7f2-kube-api-access-h8hrt" (OuterVolumeSpecName: "kube-api-access-h8hrt") pod "d4600b08-7049-4e69-a857-490413d1d7f2" (UID: "d4600b08-7049-4e69-a857-490413d1d7f2"). InnerVolumeSpecName "kube-api-access-h8hrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:18:05 crc kubenswrapper[4921]: I0312 14:18:05.913483 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8hrt\" (UniqueName: \"kubernetes.io/projected/d4600b08-7049-4e69-a857-490413d1d7f2-kube-api-access-h8hrt\") on node \"crc\" DevicePath \"\"" Mar 12 14:18:06 crc kubenswrapper[4921]: I0312 14:18:06.058439 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" event={"ID":"d4600b08-7049-4e69-a857-490413d1d7f2","Type":"ContainerDied","Data":"80b482ac958f032177d797ce2f0d194dc06d79a90a72c8b0ee9225f5e829016c"} Mar 12 14:18:06 crc kubenswrapper[4921]: I0312 14:18:06.058484 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80b482ac958f032177d797ce2f0d194dc06d79a90a72c8b0ee9225f5e829016c" Mar 12 14:18:06 crc kubenswrapper[4921]: I0312 14:18:06.058509 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555418-xxgkp" Mar 12 14:18:06 crc kubenswrapper[4921]: I0312 14:18:06.094801 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555412-8brfw"] Mar 12 14:18:06 crc kubenswrapper[4921]: I0312 14:18:06.103922 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555412-8brfw"] Mar 12 14:18:07 crc kubenswrapper[4921]: I0312 14:18:07.996725 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85e233c-885c-4bbf-be47-c8437a37a46b" path="/var/lib/kubelet/pods/c85e233c-885c-4bbf-be47-c8437a37a46b/volumes" Mar 12 14:18:26 crc kubenswrapper[4921]: I0312 14:18:26.323696 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 12 14:18:26 crc kubenswrapper[4921]: I0312 14:18:26.324986 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:18:46 crc kubenswrapper[4921]: I0312 14:18:46.813242 4921 scope.go:117] "RemoveContainer" containerID="1e4fbe54207181963f3ffecbb4c1859ea61cced7fb37e29fe4f1ec112d86e22d" Mar 12 14:18:56 crc kubenswrapper[4921]: I0312 14:18:56.324237 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:18:56 crc kubenswrapper[4921]: I0312 14:18:56.324948 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:19:26 crc kubenswrapper[4921]: I0312 14:19:26.508513 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:19:26 crc kubenswrapper[4921]: I0312 14:19:26.509261 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:19:26 crc kubenswrapper[4921]: I0312 14:19:26.542378 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:19:26 crc kubenswrapper[4921]: I0312 14:19:26.543333 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6bc5f01d3dd879fd949dcd43b51ed7002793c74a0fcf4b2431e6945845a731d6"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:19:26 crc kubenswrapper[4921]: I0312 14:19:26.543407 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://6bc5f01d3dd879fd949dcd43b51ed7002793c74a0fcf4b2431e6945845a731d6" gracePeriod=600 Mar 12 14:19:27 crc kubenswrapper[4921]: I0312 14:19:27.079213 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="6bc5f01d3dd879fd949dcd43b51ed7002793c74a0fcf4b2431e6945845a731d6" exitCode=0 Mar 12 14:19:27 crc kubenswrapper[4921]: I0312 14:19:27.079936 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"6bc5f01d3dd879fd949dcd43b51ed7002793c74a0fcf4b2431e6945845a731d6"} Mar 12 14:19:27 crc kubenswrapper[4921]: I0312 14:19:27.080057 4921 scope.go:117] "RemoveContainer" containerID="8bab92d0b007a6681f681369bd796a9e0c4e3615701517741e735e18580357fb" Mar 12 14:19:28 crc kubenswrapper[4921]: I0312 14:19:28.092552 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f"} Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.159934 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555420-7gn8f"] Mar 12 14:20:00 crc kubenswrapper[4921]: E0312 14:20:00.161597 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4600b08-7049-4e69-a857-490413d1d7f2" containerName="oc" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.161757 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4600b08-7049-4e69-a857-490413d1d7f2" containerName="oc" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.162003 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4600b08-7049-4e69-a857-490413d1d7f2" containerName="oc" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.162690 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.165012 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.165131 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.165712 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.179951 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555420-7gn8f"] Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.278871 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvkmz\" (UniqueName: \"kubernetes.io/projected/ab3d5978-4182-4e5c-8209-fa565cffb370-kube-api-access-mvkmz\") pod \"auto-csr-approver-29555420-7gn8f\" (UID: \"ab3d5978-4182-4e5c-8209-fa565cffb370\") " pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.380114 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvkmz\" (UniqueName: \"kubernetes.io/projected/ab3d5978-4182-4e5c-8209-fa565cffb370-kube-api-access-mvkmz\") pod \"auto-csr-approver-29555420-7gn8f\" (UID: \"ab3d5978-4182-4e5c-8209-fa565cffb370\") " pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.416754 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvkmz\" (UniqueName: \"kubernetes.io/projected/ab3d5978-4182-4e5c-8209-fa565cffb370-kube-api-access-mvkmz\") pod \"auto-csr-approver-29555420-7gn8f\" (UID: \"ab3d5978-4182-4e5c-8209-fa565cffb370\") " 
pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.486710 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:00 crc kubenswrapper[4921]: I0312 14:20:00.993116 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555420-7gn8f"] Mar 12 14:20:01 crc kubenswrapper[4921]: I0312 14:20:01.758736 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" event={"ID":"ab3d5978-4182-4e5c-8209-fa565cffb370","Type":"ContainerStarted","Data":"8c2f68cedddeeac133f12807cecbcaa2c914ed09b0b3c7ec5311e436a1f8a046"} Mar 12 14:20:02 crc kubenswrapper[4921]: I0312 14:20:02.768429 4921 generic.go:334] "Generic (PLEG): container finished" podID="ab3d5978-4182-4e5c-8209-fa565cffb370" containerID="ec1a19fd24e4c77fb11a655546e12880d9b699921f729ae3c94c9c2ae4922d29" exitCode=0 Mar 12 14:20:02 crc kubenswrapper[4921]: I0312 14:20:02.768471 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" event={"ID":"ab3d5978-4182-4e5c-8209-fa565cffb370","Type":"ContainerDied","Data":"ec1a19fd24e4c77fb11a655546e12880d9b699921f729ae3c94c9c2ae4922d29"} Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.376769 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.576676 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvkmz\" (UniqueName: \"kubernetes.io/projected/ab3d5978-4182-4e5c-8209-fa565cffb370-kube-api-access-mvkmz\") pod \"ab3d5978-4182-4e5c-8209-fa565cffb370\" (UID: \"ab3d5978-4182-4e5c-8209-fa565cffb370\") " Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.593127 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab3d5978-4182-4e5c-8209-fa565cffb370-kube-api-access-mvkmz" (OuterVolumeSpecName: "kube-api-access-mvkmz") pod "ab3d5978-4182-4e5c-8209-fa565cffb370" (UID: "ab3d5978-4182-4e5c-8209-fa565cffb370"). InnerVolumeSpecName "kube-api-access-mvkmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.679425 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvkmz\" (UniqueName: \"kubernetes.io/projected/ab3d5978-4182-4e5c-8209-fa565cffb370-kube-api-access-mvkmz\") on node \"crc\" DevicePath \"\"" Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.800075 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" event={"ID":"ab3d5978-4182-4e5c-8209-fa565cffb370","Type":"ContainerDied","Data":"8c2f68cedddeeac133f12807cecbcaa2c914ed09b0b3c7ec5311e436a1f8a046"} Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.800118 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c2f68cedddeeac133f12807cecbcaa2c914ed09b0b3c7ec5311e436a1f8a046" Mar 12 14:20:04 crc kubenswrapper[4921]: I0312 14:20:04.800138 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555420-7gn8f" Mar 12 14:20:05 crc kubenswrapper[4921]: I0312 14:20:05.453430 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555414-xnzjk"] Mar 12 14:20:05 crc kubenswrapper[4921]: I0312 14:20:05.461370 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555414-xnzjk"] Mar 12 14:20:05 crc kubenswrapper[4921]: I0312 14:20:05.999758 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a4240d3-387f-42dd-a1ed-5a81ebfb96e9" path="/var/lib/kubelet/pods/5a4240d3-387f-42dd-a1ed-5a81ebfb96e9/volumes" Mar 12 14:20:46 crc kubenswrapper[4921]: I0312 14:20:46.939476 4921 scope.go:117] "RemoveContainer" containerID="aa95301622884e889de9029c8e7a238cf07b727f86cc66d6b0760bd007648398" Mar 12 14:21:33 crc kubenswrapper[4921]: I0312 14:21:33.863318 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5ck"] Mar 12 14:21:33 crc kubenswrapper[4921]: E0312 14:21:33.864071 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab3d5978-4182-4e5c-8209-fa565cffb370" containerName="oc" Mar 12 14:21:33 crc kubenswrapper[4921]: I0312 14:21:33.864083 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab3d5978-4182-4e5c-8209-fa565cffb370" containerName="oc" Mar 12 14:21:33 crc kubenswrapper[4921]: I0312 14:21:33.864279 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab3d5978-4182-4e5c-8209-fa565cffb370" containerName="oc" Mar 12 14:21:33 crc kubenswrapper[4921]: I0312 14:21:33.865538 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:33 crc kubenswrapper[4921]: I0312 14:21:33.880049 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5ck"] Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.026646 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-utilities\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.026943 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8f8\" (UniqueName: \"kubernetes.io/projected/08208c32-e300-442e-a0ac-9803998d123d-kube-api-access-nv8f8\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.027079 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-catalog-content\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.129500 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8f8\" (UniqueName: \"kubernetes.io/projected/08208c32-e300-442e-a0ac-9803998d123d-kube-api-access-nv8f8\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.129565 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-catalog-content\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.129692 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-utilities\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.130335 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-catalog-content\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.130390 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-utilities\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.152804 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8f8\" (UniqueName: \"kubernetes.io/projected/08208c32-e300-442e-a0ac-9803998d123d-kube-api-access-nv8f8\") pod \"redhat-marketplace-2v5ck\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.184275 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:34 crc kubenswrapper[4921]: I0312 14:21:34.629043 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5ck"] Mar 12 14:21:35 crc kubenswrapper[4921]: I0312 14:21:35.607762 4921 generic.go:334] "Generic (PLEG): container finished" podID="08208c32-e300-442e-a0ac-9803998d123d" containerID="e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734" exitCode=0 Mar 12 14:21:35 crc kubenswrapper[4921]: I0312 14:21:35.607868 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerDied","Data":"e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734"} Mar 12 14:21:35 crc kubenswrapper[4921]: I0312 14:21:35.608128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerStarted","Data":"76a67f97171d9252c1380b9ef954ced1193f11797e942e4334854363f4ff0d9d"} Mar 12 14:21:37 crc kubenswrapper[4921]: I0312 14:21:37.631930 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerStarted","Data":"9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945"} Mar 12 14:21:38 crc kubenswrapper[4921]: I0312 14:21:38.643572 4921 generic.go:334] "Generic (PLEG): container finished" podID="08208c32-e300-442e-a0ac-9803998d123d" containerID="9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945" exitCode=0 Mar 12 14:21:38 crc kubenswrapper[4921]: I0312 14:21:38.643630 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" 
event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerDied","Data":"9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945"} Mar 12 14:21:39 crc kubenswrapper[4921]: I0312 14:21:39.654283 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerStarted","Data":"c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83"} Mar 12 14:21:39 crc kubenswrapper[4921]: I0312 14:21:39.695043 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2v5ck" podStartSLOduration=3.279100662 podStartE2EDuration="6.695021242s" podCreationTimestamp="2026-03-12 14:21:33 +0000 UTC" firstStartedPulling="2026-03-12 14:21:35.610401222 +0000 UTC m=+4318.300473193" lastFinishedPulling="2026-03-12 14:21:39.026321802 +0000 UTC m=+4321.716393773" observedRunningTime="2026-03-12 14:21:39.680430291 +0000 UTC m=+4322.370502302" watchObservedRunningTime="2026-03-12 14:21:39.695021242 +0000 UTC m=+4322.385093223" Mar 12 14:21:44 crc kubenswrapper[4921]: I0312 14:21:44.185241 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:44 crc kubenswrapper[4921]: I0312 14:21:44.185959 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:44 crc kubenswrapper[4921]: I0312 14:21:44.476992 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:44 crc kubenswrapper[4921]: I0312 14:21:44.759020 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:44 crc kubenswrapper[4921]: I0312 14:21:44.815409 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2v5ck"] Mar 12 14:21:46 crc kubenswrapper[4921]: I0312 14:21:46.734069 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2v5ck" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="registry-server" containerID="cri-o://c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83" gracePeriod=2 Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.469802 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.626038 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-catalog-content\") pod \"08208c32-e300-442e-a0ac-9803998d123d\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.626184 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv8f8\" (UniqueName: \"kubernetes.io/projected/08208c32-e300-442e-a0ac-9803998d123d-kube-api-access-nv8f8\") pod \"08208c32-e300-442e-a0ac-9803998d123d\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.626267 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-utilities\") pod \"08208c32-e300-442e-a0ac-9803998d123d\" (UID: \"08208c32-e300-442e-a0ac-9803998d123d\") " Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.628126 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-utilities" (OuterVolumeSpecName: "utilities") pod "08208c32-e300-442e-a0ac-9803998d123d" (UID: 
"08208c32-e300-442e-a0ac-9803998d123d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.642114 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08208c32-e300-442e-a0ac-9803998d123d-kube-api-access-nv8f8" (OuterVolumeSpecName: "kube-api-access-nv8f8") pod "08208c32-e300-442e-a0ac-9803998d123d" (UID: "08208c32-e300-442e-a0ac-9803998d123d"). InnerVolumeSpecName "kube-api-access-nv8f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.661310 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08208c32-e300-442e-a0ac-9803998d123d" (UID: "08208c32-e300-442e-a0ac-9803998d123d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.729036 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.729079 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv8f8\" (UniqueName: \"kubernetes.io/projected/08208c32-e300-442e-a0ac-9803998d123d-kube-api-access-nv8f8\") on node \"crc\" DevicePath \"\"" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.729094 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08208c32-e300-442e-a0ac-9803998d123d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.749099 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="08208c32-e300-442e-a0ac-9803998d123d" containerID="c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83" exitCode=0 Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.749167 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2v5ck" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.749180 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerDied","Data":"c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83"} Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.750322 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2v5ck" event={"ID":"08208c32-e300-442e-a0ac-9803998d123d","Type":"ContainerDied","Data":"76a67f97171d9252c1380b9ef954ced1193f11797e942e4334854363f4ff0d9d"} Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.750363 4921 scope.go:117] "RemoveContainer" containerID="c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.775172 4921 scope.go:117] "RemoveContainer" containerID="9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.797143 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5ck"] Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.805259 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2v5ck"] Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.812517 4921 scope.go:117] "RemoveContainer" containerID="e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.852992 4921 scope.go:117] "RemoveContainer" 
containerID="c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83" Mar 12 14:21:47 crc kubenswrapper[4921]: E0312 14:21:47.853576 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83\": container with ID starting with c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83 not found: ID does not exist" containerID="c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.853608 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83"} err="failed to get container status \"c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83\": rpc error: code = NotFound desc = could not find container \"c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83\": container with ID starting with c4477f3d11b9dbdc892f9f64a54ae0f87e2629ff69f1c7ba15ef689531d02c83 not found: ID does not exist" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.853629 4921 scope.go:117] "RemoveContainer" containerID="9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945" Mar 12 14:21:47 crc kubenswrapper[4921]: E0312 14:21:47.853999 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945\": container with ID starting with 9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945 not found: ID does not exist" containerID="9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.854043 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945"} err="failed to get container status \"9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945\": rpc error: code = NotFound desc = could not find container \"9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945\": container with ID starting with 9670a18d6cc4e8fbebf4b38ef6e3579a0afc362c1c01acee2b3ced72fc706945 not found: ID does not exist" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.854071 4921 scope.go:117] "RemoveContainer" containerID="e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734" Mar 12 14:21:47 crc kubenswrapper[4921]: E0312 14:21:47.854386 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734\": container with ID starting with e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734 not found: ID does not exist" containerID="e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.854404 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734"} err="failed to get container status \"e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734\": rpc error: code = NotFound desc = could not find container \"e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734\": container with ID starting with e05d5daf1d396f6b4e98c5a9544e7e50a85ec7965061d8edaa264f466c974734 not found: ID does not exist" Mar 12 14:21:47 crc kubenswrapper[4921]: I0312 14:21:47.994758 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08208c32-e300-442e-a0ac-9803998d123d" path="/var/lib/kubelet/pods/08208c32-e300-442e-a0ac-9803998d123d/volumes" Mar 12 14:21:56 crc kubenswrapper[4921]: I0312 
14:21:56.324131 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:21:56 crc kubenswrapper[4921]: I0312 14:21:56.324915 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.143390 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555422-489vr"] Mar 12 14:22:00 crc kubenswrapper[4921]: E0312 14:22:00.144323 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="extract-content" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.144337 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="extract-content" Mar 12 14:22:00 crc kubenswrapper[4921]: E0312 14:22:00.144355 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="registry-server" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.144363 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="registry-server" Mar 12 14:22:00 crc kubenswrapper[4921]: E0312 14:22:00.144385 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="extract-utilities" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.144393 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="extract-utilities" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.144624 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="08208c32-e300-442e-a0ac-9803998d123d" containerName="registry-server" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.145277 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.147838 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.147958 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.155044 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555422-489vr"] Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.156630 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.306219 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnhm4\" (UniqueName: \"kubernetes.io/projected/c2e3284e-67e4-46a0-b710-06ad3ec3ac89-kube-api-access-xnhm4\") pod \"auto-csr-approver-29555422-489vr\" (UID: \"c2e3284e-67e4-46a0-b710-06ad3ec3ac89\") " pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.407547 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnhm4\" (UniqueName: \"kubernetes.io/projected/c2e3284e-67e4-46a0-b710-06ad3ec3ac89-kube-api-access-xnhm4\") pod \"auto-csr-approver-29555422-489vr\" (UID: \"c2e3284e-67e4-46a0-b710-06ad3ec3ac89\") " 
pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.426301 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnhm4\" (UniqueName: \"kubernetes.io/projected/c2e3284e-67e4-46a0-b710-06ad3ec3ac89-kube-api-access-xnhm4\") pod \"auto-csr-approver-29555422-489vr\" (UID: \"c2e3284e-67e4-46a0-b710-06ad3ec3ac89\") " pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.466396 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.956081 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555422-489vr"] Mar 12 14:22:00 crc kubenswrapper[4921]: I0312 14:22:00.960403 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:22:01 crc kubenswrapper[4921]: I0312 14:22:01.862127 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555422-489vr" event={"ID":"c2e3284e-67e4-46a0-b710-06ad3ec3ac89","Type":"ContainerStarted","Data":"4644823c9578df91b1f115a9a7ffe44b6b682f983c798b107d21227ed7c030f7"} Mar 12 14:22:02 crc kubenswrapper[4921]: I0312 14:22:02.871500 4921 generic.go:334] "Generic (PLEG): container finished" podID="c2e3284e-67e4-46a0-b710-06ad3ec3ac89" containerID="854a7a6c0535bd4d7baa5d328809d02e695426a2a3bcf4be42534dd99ce85415" exitCode=0 Mar 12 14:22:02 crc kubenswrapper[4921]: I0312 14:22:02.871545 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555422-489vr" event={"ID":"c2e3284e-67e4-46a0-b710-06ad3ec3ac89","Type":"ContainerDied","Data":"854a7a6c0535bd4d7baa5d328809d02e695426a2a3bcf4be42534dd99ce85415"} Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.401542 4921 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.590673 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnhm4\" (UniqueName: \"kubernetes.io/projected/c2e3284e-67e4-46a0-b710-06ad3ec3ac89-kube-api-access-xnhm4\") pod \"c2e3284e-67e4-46a0-b710-06ad3ec3ac89\" (UID: \"c2e3284e-67e4-46a0-b710-06ad3ec3ac89\") " Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.603114 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e3284e-67e4-46a0-b710-06ad3ec3ac89-kube-api-access-xnhm4" (OuterVolumeSpecName: "kube-api-access-xnhm4") pod "c2e3284e-67e4-46a0-b710-06ad3ec3ac89" (UID: "c2e3284e-67e4-46a0-b710-06ad3ec3ac89"). InnerVolumeSpecName "kube-api-access-xnhm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.693281 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnhm4\" (UniqueName: \"kubernetes.io/projected/c2e3284e-67e4-46a0-b710-06ad3ec3ac89-kube-api-access-xnhm4\") on node \"crc\" DevicePath \"\"" Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.888235 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555422-489vr" event={"ID":"c2e3284e-67e4-46a0-b710-06ad3ec3ac89","Type":"ContainerDied","Data":"4644823c9578df91b1f115a9a7ffe44b6b682f983c798b107d21227ed7c030f7"} Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.888289 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555422-489vr" Mar 12 14:22:04 crc kubenswrapper[4921]: I0312 14:22:04.888304 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4644823c9578df91b1f115a9a7ffe44b6b682f983c798b107d21227ed7c030f7" Mar 12 14:22:05 crc kubenswrapper[4921]: I0312 14:22:05.463843 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555416-jwnhq"] Mar 12 14:22:05 crc kubenswrapper[4921]: I0312 14:22:05.472005 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555416-jwnhq"] Mar 12 14:22:06 crc kubenswrapper[4921]: I0312 14:22:06.007081 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1" path="/var/lib/kubelet/pods/0ee3fdcd-f647-4dfb-a4f1-e95b448bf2a1/volumes" Mar 12 14:22:24 crc kubenswrapper[4921]: I0312 14:22:24.284071 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-nq8wj" podUID="c6de3785-ea06-49bb-9b39-d8f2f10bce81" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.65:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 14:22:25 crc kubenswrapper[4921]: I0312 14:22:25.258852 4921 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.693663903s: [/var/lib/containers/storage/overlay/b5fddf834856789a30d40ac03ec79def0b7ee4f7ca97297ef51babfd71369aba/diff /var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-ppq69_e1bd23bf-3c09-41ff-9840-3397219f3f4d/nmstate-console-plugin/0.log]; will not log again for this container unless duration exceeds 2s Mar 12 14:22:25 crc kubenswrapper[4921]: E0312 14:22:25.346796 4921 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.364s" Mar 12 14:22:26 crc 
kubenswrapper[4921]: I0312 14:22:26.323585 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:22:26 crc kubenswrapper[4921]: I0312 14:22:26.323653 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:22:33 crc kubenswrapper[4921]: I0312 14:22:33.058283 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="69b5525a-14c6-453f-9673-11d9e63dd25a" containerName="galera" probeResult="failure" output="command timed out" Mar 12 14:22:47 crc kubenswrapper[4921]: I0312 14:22:47.028136 4921 scope.go:117] "RemoveContainer" containerID="694358b65ef6dda4bd9b20e54980fd7074ff88dff9bff0887a59ea67649a2ea9" Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.323508 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.324141 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.324214 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.325033 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.325087 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" gracePeriod=600 Mar 12 14:22:56 crc kubenswrapper[4921]: E0312 14:22:56.465098 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.754909 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" exitCode=0 Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.754966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f"} Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.754997 4921 scope.go:117] "RemoveContainer" containerID="6bc5f01d3dd879fd949dcd43b51ed7002793c74a0fcf4b2431e6945845a731d6" Mar 12 14:22:56 crc kubenswrapper[4921]: I0312 14:22:56.755920 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:22:56 crc kubenswrapper[4921]: E0312 14:22:56.756237 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:23:09 crc kubenswrapper[4921]: I0312 14:23:09.983549 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:23:09 crc kubenswrapper[4921]: E0312 14:23:09.984227 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:23:23 crc kubenswrapper[4921]: I0312 14:23:23.984130 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:23:23 crc kubenswrapper[4921]: E0312 14:23:23.985313 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:23:34 crc kubenswrapper[4921]: I0312 14:23:34.983788 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:23:34 crc kubenswrapper[4921]: E0312 14:23:34.984690 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:23:47 crc kubenswrapper[4921]: I0312 14:23:47.992477 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:23:47 crc kubenswrapper[4921]: E0312 14:23:47.994183 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.151593 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555424-lx78r"] Mar 12 14:24:00 crc kubenswrapper[4921]: E0312 14:24:00.152742 4921 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c2e3284e-67e4-46a0-b710-06ad3ec3ac89" containerName="oc" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.152759 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e3284e-67e4-46a0-b710-06ad3ec3ac89" containerName="oc" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.153017 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e3284e-67e4-46a0-b710-06ad3ec3ac89" containerName="oc" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.153771 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.156074 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.156166 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.156902 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.166966 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555424-lx78r"] Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.212534 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-728mh\" (UniqueName: \"kubernetes.io/projected/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f-kube-api-access-728mh\") pod \"auto-csr-approver-29555424-lx78r\" (UID: \"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f\") " pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.314937 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-728mh\" (UniqueName: 
\"kubernetes.io/projected/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f-kube-api-access-728mh\") pod \"auto-csr-approver-29555424-lx78r\" (UID: \"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f\") " pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.336783 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-728mh\" (UniqueName: \"kubernetes.io/projected/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f-kube-api-access-728mh\") pod \"auto-csr-approver-29555424-lx78r\" (UID: \"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f\") " pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.477497 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:00 crc kubenswrapper[4921]: I0312 14:24:00.979535 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555424-lx78r"] Mar 12 14:24:01 crc kubenswrapper[4921]: I0312 14:24:01.375377 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555424-lx78r" event={"ID":"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f","Type":"ContainerStarted","Data":"699dbaf71008391f8b355c5f101fa84dc0be08139b7e8b1344245b759cbd2094"} Mar 12 14:24:01 crc kubenswrapper[4921]: I0312 14:24:01.984543 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:24:01 crc kubenswrapper[4921]: E0312 14:24:01.984806 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:24:03 crc kubenswrapper[4921]: I0312 14:24:03.395348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555424-lx78r" event={"ID":"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f","Type":"ContainerStarted","Data":"4f095873852ad8be7a6dc69a1860ece4812d82b6dc197f225f98f8d32d4b1bf7"} Mar 12 14:24:03 crc kubenswrapper[4921]: I0312 14:24:03.413856 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555424-lx78r" podStartSLOduration=1.548456238 podStartE2EDuration="3.413807762s" podCreationTimestamp="2026-03-12 14:24:00 +0000 UTC" firstStartedPulling="2026-03-12 14:24:00.993032105 +0000 UTC m=+4463.683104076" lastFinishedPulling="2026-03-12 14:24:02.858383629 +0000 UTC m=+4465.548455600" observedRunningTime="2026-03-12 14:24:03.408338033 +0000 UTC m=+4466.098410004" watchObservedRunningTime="2026-03-12 14:24:03.413807762 +0000 UTC m=+4466.103879733" Mar 12 14:24:04 crc kubenswrapper[4921]: I0312 14:24:04.872078 4921 generic.go:334] "Generic (PLEG): container finished" podID="23ba2cd1-8f0a-454b-9fa4-18d7ea84706f" containerID="4f095873852ad8be7a6dc69a1860ece4812d82b6dc197f225f98f8d32d4b1bf7" exitCode=0 Mar 12 14:24:04 crc kubenswrapper[4921]: I0312 14:24:04.872345 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555424-lx78r" event={"ID":"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f","Type":"ContainerDied","Data":"4f095873852ad8be7a6dc69a1860ece4812d82b6dc197f225f98f8d32d4b1bf7"} Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.559524 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.705130 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-728mh\" (UniqueName: \"kubernetes.io/projected/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f-kube-api-access-728mh\") pod \"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f\" (UID: \"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f\") " Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.716183 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f-kube-api-access-728mh" (OuterVolumeSpecName: "kube-api-access-728mh") pod "23ba2cd1-8f0a-454b-9fa4-18d7ea84706f" (UID: "23ba2cd1-8f0a-454b-9fa4-18d7ea84706f"). InnerVolumeSpecName "kube-api-access-728mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.808328 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-728mh\" (UniqueName: \"kubernetes.io/projected/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f-kube-api-access-728mh\") on node \"crc\" DevicePath \"\"" Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.876412 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555418-xxgkp"] Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.887709 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555418-xxgkp"] Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.888866 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555424-lx78r" event={"ID":"23ba2cd1-8f0a-454b-9fa4-18d7ea84706f","Type":"ContainerDied","Data":"699dbaf71008391f8b355c5f101fa84dc0be08139b7e8b1344245b759cbd2094"} Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.888908 4921 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="699dbaf71008391f8b355c5f101fa84dc0be08139b7e8b1344245b759cbd2094" Mar 12 14:24:06 crc kubenswrapper[4921]: I0312 14:24:06.888961 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555424-lx78r" Mar 12 14:24:07 crc kubenswrapper[4921]: I0312 14:24:07.992114 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4600b08-7049-4e69-a857-490413d1d7f2" path="/var/lib/kubelet/pods/d4600b08-7049-4e69-a857-490413d1d7f2/volumes" Mar 12 14:24:15 crc kubenswrapper[4921]: I0312 14:24:15.984684 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:24:15 crc kubenswrapper[4921]: E0312 14:24:15.986555 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:24:27 crc kubenswrapper[4921]: I0312 14:24:27.993923 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:24:27 crc kubenswrapper[4921]: E0312 14:24:27.995212 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:24:38 crc kubenswrapper[4921]: I0312 14:24:38.984258 4921 scope.go:117] "RemoveContainer" 
containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:24:38 crc kubenswrapper[4921]: E0312 14:24:38.985122 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:24:47 crc kubenswrapper[4921]: I0312 14:24:47.178570 4921 scope.go:117] "RemoveContainer" containerID="d487f8845a170909ae96bd8ca22cf1ad0124a90292dc2c9b1982ffa96a69acd4" Mar 12 14:24:52 crc kubenswrapper[4921]: I0312 14:24:52.984100 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:24:52 crc kubenswrapper[4921]: E0312 14:24:52.984711 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:25:05 crc kubenswrapper[4921]: I0312 14:25:05.989257 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:25:05 crc kubenswrapper[4921]: E0312 14:25:05.991636 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:25:18 crc kubenswrapper[4921]: I0312 14:25:18.983299 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:25:18 crc kubenswrapper[4921]: E0312 14:25:18.984148 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:25:29 crc kubenswrapper[4921]: I0312 14:25:29.983983 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:25:29 crc kubenswrapper[4921]: E0312 14:25:29.984790 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:25:40 crc kubenswrapper[4921]: I0312 14:25:40.986882 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:25:40 crc kubenswrapper[4921]: E0312 14:25:40.987802 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:25:54 crc kubenswrapper[4921]: I0312 14:25:54.984736 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:25:54 crc kubenswrapper[4921]: E0312 14:25:54.985699 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.154994 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555426-9v2qd"] Mar 12 14:26:00 crc kubenswrapper[4921]: E0312 14:26:00.156030 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ba2cd1-8f0a-454b-9fa4-18d7ea84706f" containerName="oc" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.156046 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ba2cd1-8f0a-454b-9fa4-18d7ea84706f" containerName="oc" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.156242 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ba2cd1-8f0a-454b-9fa4-18d7ea84706f" containerName="oc" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.157123 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.159247 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.159525 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.159590 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.179024 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555426-9v2qd"] Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.447635 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqbt\" (UniqueName: \"kubernetes.io/projected/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8-kube-api-access-5wqbt\") pod \"auto-csr-approver-29555426-9v2qd\" (UID: \"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8\") " pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:00 crc kubenswrapper[4921]: I0312 14:26:00.550164 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqbt\" (UniqueName: \"kubernetes.io/projected/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8-kube-api-access-5wqbt\") pod \"auto-csr-approver-29555426-9v2qd\" (UID: \"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8\") " pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:01 crc kubenswrapper[4921]: I0312 14:26:01.008175 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqbt\" (UniqueName: \"kubernetes.io/projected/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8-kube-api-access-5wqbt\") pod \"auto-csr-approver-29555426-9v2qd\" (UID: \"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8\") " 
pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:01 crc kubenswrapper[4921]: I0312 14:26:01.073452 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:02 crc kubenswrapper[4921]: I0312 14:26:02.009999 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555426-9v2qd"] Mar 12 14:26:02 crc kubenswrapper[4921]: I0312 14:26:02.200616 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" event={"ID":"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8","Type":"ContainerStarted","Data":"0b95f7ec32f7ae536f9dc92c6f6dfaf7986ff6bd28351fde734183a76668a96d"} Mar 12 14:26:04 crc kubenswrapper[4921]: I0312 14:26:04.219431 4921 generic.go:334] "Generic (PLEG): container finished" podID="a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8" containerID="210453b51ac5cc012689db2a82b19d465448f3d0f3e9be90decb327490f5fb75" exitCode=0 Mar 12 14:26:04 crc kubenswrapper[4921]: I0312 14:26:04.219544 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" event={"ID":"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8","Type":"ContainerDied","Data":"210453b51ac5cc012689db2a82b19d465448f3d0f3e9be90decb327490f5fb75"} Mar 12 14:26:05 crc kubenswrapper[4921]: I0312 14:26:05.713390 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:05 crc kubenswrapper[4921]: I0312 14:26:05.860302 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqbt\" (UniqueName: \"kubernetes.io/projected/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8-kube-api-access-5wqbt\") pod \"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8\" (UID: \"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8\") " Mar 12 14:26:05 crc kubenswrapper[4921]: I0312 14:26:05.866458 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8-kube-api-access-5wqbt" (OuterVolumeSpecName: "kube-api-access-5wqbt") pod "a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8" (UID: "a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8"). InnerVolumeSpecName "kube-api-access-5wqbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:26:05 crc kubenswrapper[4921]: I0312 14:26:05.968712 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqbt\" (UniqueName: \"kubernetes.io/projected/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8-kube-api-access-5wqbt\") on node \"crc\" DevicePath \"\"" Mar 12 14:26:06 crc kubenswrapper[4921]: I0312 14:26:06.243669 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" event={"ID":"a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8","Type":"ContainerDied","Data":"0b95f7ec32f7ae536f9dc92c6f6dfaf7986ff6bd28351fde734183a76668a96d"} Mar 12 14:26:06 crc kubenswrapper[4921]: I0312 14:26:06.243716 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b95f7ec32f7ae536f9dc92c6f6dfaf7986ff6bd28351fde734183a76668a96d" Mar 12 14:26:06 crc kubenswrapper[4921]: I0312 14:26:06.243719 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555426-9v2qd" Mar 12 14:26:06 crc kubenswrapper[4921]: I0312 14:26:06.785931 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555420-7gn8f"] Mar 12 14:26:06 crc kubenswrapper[4921]: I0312 14:26:06.793193 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555420-7gn8f"] Mar 12 14:26:07 crc kubenswrapper[4921]: I0312 14:26:07.989225 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:26:07 crc kubenswrapper[4921]: E0312 14:26:07.989702 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:26:07 crc kubenswrapper[4921]: I0312 14:26:07.995342 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab3d5978-4182-4e5c-8209-fa565cffb370" path="/var/lib/kubelet/pods/ab3d5978-4182-4e5c-8209-fa565cffb370/volumes" Mar 12 14:26:19 crc kubenswrapper[4921]: I0312 14:26:19.983423 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:26:19 crc kubenswrapper[4921]: E0312 14:26:19.984265 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:26:30 crc kubenswrapper[4921]: I0312 14:26:30.983716 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:26:30 crc kubenswrapper[4921]: E0312 14:26:30.984612 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:26:43 crc kubenswrapper[4921]: I0312 14:26:43.984073 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:26:43 crc kubenswrapper[4921]: E0312 14:26:43.984771 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:26:47 crc kubenswrapper[4921]: I0312 14:26:47.302419 4921 scope.go:117] "RemoveContainer" containerID="ec1a19fd24e4c77fb11a655546e12880d9b699921f729ae3c94c9c2ae4922d29" Mar 12 14:26:55 crc kubenswrapper[4921]: I0312 14:26:55.983630 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:26:55 crc kubenswrapper[4921]: E0312 14:26:55.984548 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:27:10 crc kubenswrapper[4921]: I0312 14:27:10.984340 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:27:10 crc kubenswrapper[4921]: E0312 14:27:10.985376 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:27:23 crc kubenswrapper[4921]: I0312 14:27:23.984188 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:27:23 crc kubenswrapper[4921]: E0312 14:27:23.985083 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.398137 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cdk9w"] Mar 12 14:27:31 crc kubenswrapper[4921]: E0312 14:27:31.399162 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8" containerName="oc" Mar 12 14:27:31 crc kubenswrapper[4921]: 
I0312 14:27:31.399181 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8" containerName="oc" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.399454 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8" containerName="oc" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.401100 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.422845 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdk9w"] Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.547597 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-catalog-content\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.548006 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzr29\" (UniqueName: \"kubernetes.io/projected/cfe1a746-8428-467a-a8c9-c7ce289d4e37-kube-api-access-tzr29\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.548291 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-utilities\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.650636 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-utilities\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.650749 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-catalog-content\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.650900 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzr29\" (UniqueName: \"kubernetes.io/projected/cfe1a746-8428-467a-a8c9-c7ce289d4e37-kube-api-access-tzr29\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.651436 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-utilities\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.651474 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-catalog-content\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.675088 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tzr29\" (UniqueName: \"kubernetes.io/projected/cfe1a746-8428-467a-a8c9-c7ce289d4e37-kube-api-access-tzr29\") pod \"redhat-operators-cdk9w\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") " pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:31 crc kubenswrapper[4921]: I0312 14:27:31.736296 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdk9w" Mar 12 14:27:32 crc kubenswrapper[4921]: I0312 14:27:32.203938 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cdk9w"] Mar 12 14:27:32 crc kubenswrapper[4921]: I0312 14:27:32.245296 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerStarted","Data":"54aa49d77b0ef5ceed87a7702cab4125408f87e54f610ada370bdb6abb737107"} Mar 12 14:27:33 crc kubenswrapper[4921]: I0312 14:27:33.255613 4921 generic.go:334] "Generic (PLEG): container finished" podID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerID="254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270" exitCode=0 Mar 12 14:27:33 crc kubenswrapper[4921]: I0312 14:27:33.255678 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerDied","Data":"254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270"} Mar 12 14:27:33 crc kubenswrapper[4921]: I0312 14:27:33.258757 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:27:35 crc kubenswrapper[4921]: I0312 14:27:35.281112 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerStarted","Data":"9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7"} 
Mar 12 14:27:37 crc kubenswrapper[4921]: I0312 14:27:37.989899 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:27:37 crc kubenswrapper[4921]: E0312 14:27:37.990930 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:27:40 crc kubenswrapper[4921]: I0312 14:27:40.332627 4921 generic.go:334] "Generic (PLEG): container finished" podID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerID="9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7" exitCode=0 Mar 12 14:27:40 crc kubenswrapper[4921]: I0312 14:27:40.333185 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerDied","Data":"9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7"} Mar 12 14:27:41 crc kubenswrapper[4921]: I0312 14:27:41.342943 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerStarted","Data":"bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46"} Mar 12 14:27:41 crc kubenswrapper[4921]: I0312 14:27:41.368588 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cdk9w" podStartSLOduration=2.872145094 podStartE2EDuration="10.368569926s" podCreationTimestamp="2026-03-12 14:27:31 +0000 UTC" firstStartedPulling="2026-03-12 14:27:33.258556186 +0000 UTC m=+4675.948628157" lastFinishedPulling="2026-03-12 
14:27:40.754981018 +0000 UTC m=+4683.445052989" observedRunningTime="2026-03-12 14:27:41.35995211 +0000 UTC m=+4684.050024081" watchObservedRunningTime="2026-03-12 14:27:41.368569926 +0000 UTC m=+4684.058641897"
Mar 12 14:27:41 crc kubenswrapper[4921]: I0312 14:27:41.737613 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cdk9w"
Mar 12 14:27:41 crc kubenswrapper[4921]: I0312 14:27:41.737664 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cdk9w"
Mar 12 14:27:42 crc kubenswrapper[4921]: I0312 14:27:42.791594 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cdk9w" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="registry-server" probeResult="failure" output=<
Mar 12 14:27:42 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s
Mar 12 14:27:42 crc kubenswrapper[4921]: >
Mar 12 14:27:50 crc kubenswrapper[4921]: I0312 14:27:50.982981 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f"
Mar 12 14:27:50 crc kubenswrapper[4921]: E0312 14:27:50.983587 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 14:27:52 crc kubenswrapper[4921]: I0312 14:27:52.794909 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cdk9w" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="registry-server" probeResult="failure" output=<
Mar 12 14:27:52 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s
Mar 12 14:27:52 crc kubenswrapper[4921]: >
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.029976 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g6xq6"]
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.032296 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.042784 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g6xq6"]
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.172628 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-catalog-content\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.172771 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-utilities\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.172849 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j92s\" (UniqueName: \"kubernetes.io/projected/21ce62d2-eeff-4060-b114-e4f59406c80d-kube-api-access-2j92s\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.275001 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-catalog-content\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.275548 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-utilities\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.275602 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j92s\" (UniqueName: \"kubernetes.io/projected/21ce62d2-eeff-4060-b114-e4f59406c80d-kube-api-access-2j92s\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.276035 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-catalog-content\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.276060 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-utilities\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.305919 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j92s\" (UniqueName: \"kubernetes.io/projected/21ce62d2-eeff-4060-b114-e4f59406c80d-kube-api-access-2j92s\") pod \"certified-operators-g6xq6\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") " pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.370960 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:27:54 crc kubenswrapper[4921]: I0312 14:27:54.880604 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g6xq6"]
Mar 12 14:27:55 crc kubenswrapper[4921]: I0312 14:27:55.467834 4921 generic.go:334] "Generic (PLEG): container finished" podID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerID="4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09" exitCode=0
Mar 12 14:27:55 crc kubenswrapper[4921]: I0312 14:27:55.468040 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerDied","Data":"4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09"}
Mar 12 14:27:55 crc kubenswrapper[4921]: I0312 14:27:55.468126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerStarted","Data":"97898929bff18db27399fdb5ae6d25b94ab0cddc3e230c73274daf2f5274d109"}
Mar 12 14:27:56 crc kubenswrapper[4921]: I0312 14:27:56.478245 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerStarted","Data":"84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c"}
Mar 12 14:27:58 crc kubenswrapper[4921]: I0312 14:27:58.494783 4921 generic.go:334] "Generic (PLEG): container finished" podID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerID="84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c" exitCode=0
Mar 12 14:27:58 crc kubenswrapper[4921]: I0312 14:27:58.494854 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerDied","Data":"84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c"}
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.139841 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555428-zclwx"]
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.141775 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.148619 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555428-zclwx"]
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.149584 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.149875 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.153009 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.290582 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5pfz\" (UniqueName: \"kubernetes.io/projected/3d9874e8-164a-4006-8a19-b2c7266e9c3a-kube-api-access-f5pfz\") pod \"auto-csr-approver-29555428-zclwx\" (UID: \"3d9874e8-164a-4006-8a19-b2c7266e9c3a\") " pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.392414 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5pfz\" (UniqueName: \"kubernetes.io/projected/3d9874e8-164a-4006-8a19-b2c7266e9c3a-kube-api-access-f5pfz\") pod \"auto-csr-approver-29555428-zclwx\" (UID: \"3d9874e8-164a-4006-8a19-b2c7266e9c3a\") " pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.412234 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5pfz\" (UniqueName: \"kubernetes.io/projected/3d9874e8-164a-4006-8a19-b2c7266e9c3a-kube-api-access-f5pfz\") pod \"auto-csr-approver-29555428-zclwx\" (UID: \"3d9874e8-164a-4006-8a19-b2c7266e9c3a\") " pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.494686 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.514441 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerStarted","Data":"21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb"}
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.544323 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g6xq6" podStartSLOduration=3.927807551 podStartE2EDuration="7.544302714s" podCreationTimestamp="2026-03-12 14:27:53 +0000 UTC" firstStartedPulling="2026-03-12 14:27:55.46998074 +0000 UTC m=+4698.160052711" lastFinishedPulling="2026-03-12 14:27:59.086475903 +0000 UTC m=+4701.776547874" observedRunningTime="2026-03-12 14:28:00.535101329 +0000 UTC m=+4703.225173290" watchObservedRunningTime="2026-03-12 14:28:00.544302714 +0000 UTC m=+4703.234374705"
Mar 12 14:28:00 crc kubenswrapper[4921]: I0312 14:28:00.990125 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555428-zclwx"]
Mar 12 14:28:01 crc kubenswrapper[4921]: I0312 14:28:01.524421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555428-zclwx" event={"ID":"3d9874e8-164a-4006-8a19-b2c7266e9c3a","Type":"ContainerStarted","Data":"e63f5422f0fb9857535a3a35a7b0d3b3b68095f43a1709175218533d6313ef54"}
Mar 12 14:28:01 crc kubenswrapper[4921]: I0312 14:28:01.799615 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cdk9w"
Mar 12 14:28:01 crc kubenswrapper[4921]: I0312 14:28:01.847858 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cdk9w"
Mar 12 14:28:02 crc kubenswrapper[4921]: I0312 14:28:02.372633 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdk9w"]
Mar 12 14:28:03 crc kubenswrapper[4921]: I0312 14:28:03.543143 4921 generic.go:334] "Generic (PLEG): container finished" podID="3d9874e8-164a-4006-8a19-b2c7266e9c3a" containerID="d3959cbbd4c53a61345b0ed46e3c5018eaf7fa6e911925b7012b44d91b685fc4" exitCode=0
Mar 12 14:28:03 crc kubenswrapper[4921]: I0312 14:28:03.543192 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555428-zclwx" event={"ID":"3d9874e8-164a-4006-8a19-b2c7266e9c3a","Type":"ContainerDied","Data":"d3959cbbd4c53a61345b0ed46e3c5018eaf7fa6e911925b7012b44d91b685fc4"}
Mar 12 14:28:03 crc kubenswrapper[4921]: I0312 14:28:03.543663 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cdk9w" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="registry-server" containerID="cri-o://bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46" gracePeriod=2
Mar 12 14:28:04 crc kubenswrapper[4921]: I0312 14:28:04.235657 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdk9w"
Mar 12 14:28:04 crc kubenswrapper[4921]: I0312 14:28:04.371939 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:28:04 crc kubenswrapper[4921]: I0312 14:28:04.372278 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:28:04 crc kubenswrapper[4921]: I0312 14:28:04.386428 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-catalog-content\") pod \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") "
Mar 12 14:28:04 crc kubenswrapper[4921]: I0312 14:28:04.386670 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-utilities\") pod \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") "
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.387081 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzr29\" (UniqueName: \"kubernetes.io/projected/cfe1a746-8428-467a-a8c9-c7ce289d4e37-kube-api-access-tzr29\") pod \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\" (UID: \"cfe1a746-8428-467a-a8c9-c7ce289d4e37\") "
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.387302 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-utilities" (OuterVolumeSpecName: "utilities") pod "cfe1a746-8428-467a-a8c9-c7ce289d4e37" (UID: "cfe1a746-8428-467a-a8c9-c7ce289d4e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.400543 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe1a746-8428-467a-a8c9-c7ce289d4e37-kube-api-access-tzr29" (OuterVolumeSpecName: "kube-api-access-tzr29") pod "cfe1a746-8428-467a-a8c9-c7ce289d4e37" (UID: "cfe1a746-8428-467a-a8c9-c7ce289d4e37"). InnerVolumeSpecName "kube-api-access-tzr29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.489333 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.489360 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzr29\" (UniqueName: \"kubernetes.io/projected/cfe1a746-8428-467a-a8c9-c7ce289d4e37-kube-api-access-tzr29\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.523347 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfe1a746-8428-467a-a8c9-c7ce289d4e37" (UID: "cfe1a746-8428-467a-a8c9-c7ce289d4e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.552279 4921 generic.go:334] "Generic (PLEG): container finished" podID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerID="bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46" exitCode=0
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.552478 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cdk9w"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.554254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerDied","Data":"bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46"}
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.554293 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cdk9w" event={"ID":"cfe1a746-8428-467a-a8c9-c7ce289d4e37","Type":"ContainerDied","Data":"54aa49d77b0ef5ceed87a7702cab4125408f87e54f610ada370bdb6abb737107"}
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.554314 4921 scope.go:117] "RemoveContainer" containerID="bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.578274 4921 scope.go:117] "RemoveContainer" containerID="9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.587938 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cdk9w"]
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.593199 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe1a746-8428-467a-a8c9-c7ce289d4e37-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.596094 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cdk9w"]
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.628994 4921 scope.go:117] "RemoveContainer" containerID="254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.680700 4921 scope.go:117] "RemoveContainer" containerID="bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46"
Mar 12 14:28:05 crc kubenswrapper[4921]: E0312 14:28:04.681709 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46\": container with ID starting with bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46 not found: ID does not exist" containerID="bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.681750 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46"} err="failed to get container status \"bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46\": rpc error: code = NotFound desc = could not find container \"bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46\": container with ID starting with bba2ced13b5c65fc907aa78f8b8c117871aad6ef476d37b23ad3aac4d8648b46 not found: ID does not exist"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.681771 4921 scope.go:117] "RemoveContainer" containerID="9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7"
Mar 12 14:28:05 crc kubenswrapper[4921]: E0312 14:28:04.681999 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7\": container with ID starting with 9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7 not found: ID does not exist" containerID="9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.682015 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7"} err="failed to get container status \"9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7\": rpc error: code = NotFound desc = could not find container \"9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7\": container with ID starting with 9b45e778aa554dc03342da866c5b4b95ce27ac621aeaa7cf4dd49e245b659cd7 not found: ID does not exist"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.682028 4921 scope.go:117] "RemoveContainer" containerID="254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270"
Mar 12 14:28:05 crc kubenswrapper[4921]: E0312 14:28:04.682210 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270\": container with ID starting with 254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270 not found: ID does not exist" containerID="254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.682226 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270"} err="failed to get container status \"254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270\": rpc error: code = NotFound desc = could not find container \"254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270\": container with ID starting with 254f0df2ffcb3f6146f875ff185ab99d0c78a44b5b2c7ae0dfef734099f58270 not found: ID does not exist"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:04.976997 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.103587 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5pfz\" (UniqueName: \"kubernetes.io/projected/3d9874e8-164a-4006-8a19-b2c7266e9c3a-kube-api-access-f5pfz\") pod \"3d9874e8-164a-4006-8a19-b2c7266e9c3a\" (UID: \"3d9874e8-164a-4006-8a19-b2c7266e9c3a\") "
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.120795 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9874e8-164a-4006-8a19-b2c7266e9c3a-kube-api-access-f5pfz" (OuterVolumeSpecName: "kube-api-access-f5pfz") pod "3d9874e8-164a-4006-8a19-b2c7266e9c3a" (UID: "3d9874e8-164a-4006-8a19-b2c7266e9c3a"). InnerVolumeSpecName "kube-api-access-f5pfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.206205 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5pfz\" (UniqueName: \"kubernetes.io/projected/3d9874e8-164a-4006-8a19-b2c7266e9c3a-kube-api-access-f5pfz\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.428500 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-g6xq6" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="registry-server" probeResult="failure" output=<
Mar 12 14:28:05 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s
Mar 12 14:28:05 crc kubenswrapper[4921]: >
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.564455 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555428-zclwx" event={"ID":"3d9874e8-164a-4006-8a19-b2c7266e9c3a","Type":"ContainerDied","Data":"e63f5422f0fb9857535a3a35a7b0d3b3b68095f43a1709175218533d6313ef54"}
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.564508 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e63f5422f0fb9857535a3a35a7b0d3b3b68095f43a1709175218533d6313ef54"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.564558 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555428-zclwx"
Mar 12 14:28:05 crc kubenswrapper[4921]: I0312 14:28:05.988032 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f"
Mar 12 14:28:06 crc kubenswrapper[4921]: I0312 14:28:06.012726 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" path="/var/lib/kubelet/pods/cfe1a746-8428-467a-a8c9-c7ce289d4e37/volumes"
Mar 12 14:28:06 crc kubenswrapper[4921]: I0312 14:28:06.065906 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555422-489vr"]
Mar 12 14:28:06 crc kubenswrapper[4921]: I0312 14:28:06.073291 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555422-489vr"]
Mar 12 14:28:06 crc kubenswrapper[4921]: I0312 14:28:06.575848 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"4aff2f6b38d448fe182dbeed50a960e793420d5a55853297536d4fb1c83afee3"}
Mar 12 14:28:07 crc kubenswrapper[4921]: I0312 14:28:07.995532 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e3284e-67e4-46a0-b710-06ad3ec3ac89" path="/var/lib/kubelet/pods/c2e3284e-67e4-46a0-b710-06ad3ec3ac89/volumes"
Mar 12 14:28:14 crc kubenswrapper[4921]: I0312 14:28:14.420960 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:28:14 crc kubenswrapper[4921]: I0312 14:28:14.480887 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:28:14 crc kubenswrapper[4921]: I0312 14:28:14.671259 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g6xq6"]
Mar 12 14:28:15 crc kubenswrapper[4921]: I0312 14:28:15.644413 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g6xq6" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="registry-server" containerID="cri-o://21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb" gracePeriod=2
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.352038 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.478507 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j92s\" (UniqueName: \"kubernetes.io/projected/21ce62d2-eeff-4060-b114-e4f59406c80d-kube-api-access-2j92s\") pod \"21ce62d2-eeff-4060-b114-e4f59406c80d\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") "
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.478639 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-utilities\") pod \"21ce62d2-eeff-4060-b114-e4f59406c80d\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") "
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.478841 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-catalog-content\") pod \"21ce62d2-eeff-4060-b114-e4f59406c80d\" (UID: \"21ce62d2-eeff-4060-b114-e4f59406c80d\") "
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.479435 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-utilities" (OuterVolumeSpecName: "utilities") pod "21ce62d2-eeff-4060-b114-e4f59406c80d" (UID: "21ce62d2-eeff-4060-b114-e4f59406c80d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.485002 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ce62d2-eeff-4060-b114-e4f59406c80d-kube-api-access-2j92s" (OuterVolumeSpecName: "kube-api-access-2j92s") pod "21ce62d2-eeff-4060-b114-e4f59406c80d" (UID: "21ce62d2-eeff-4060-b114-e4f59406c80d"). InnerVolumeSpecName "kube-api-access-2j92s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.547528 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21ce62d2-eeff-4060-b114-e4f59406c80d" (UID: "21ce62d2-eeff-4060-b114-e4f59406c80d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.582035 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j92s\" (UniqueName: \"kubernetes.io/projected/21ce62d2-eeff-4060-b114-e4f59406c80d-kube-api-access-2j92s\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.582075 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.582090 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ce62d2-eeff-4060-b114-e4f59406c80d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.656663 4921 generic.go:334] "Generic (PLEG): container finished" podID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerID="21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb" exitCode=0
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.656711 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6xq6"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.656711 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerDied","Data":"21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb"}
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.656859 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6xq6" event={"ID":"21ce62d2-eeff-4060-b114-e4f59406c80d","Type":"ContainerDied","Data":"97898929bff18db27399fdb5ae6d25b94ab0cddc3e230c73274daf2f5274d109"}
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.656885 4921 scope.go:117] "RemoveContainer" containerID="21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.692950 4921 scope.go:117] "RemoveContainer" containerID="84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.694757 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g6xq6"]
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.709301 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g6xq6"]
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.739185 4921 scope.go:117] "RemoveContainer" containerID="4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.773173 4921 scope.go:117] "RemoveContainer" containerID="21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb"
Mar 12 14:28:16 crc kubenswrapper[4921]: E0312 14:28:16.773605 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb\": container with ID starting with 21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb not found: ID does not exist" containerID="21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.773641 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb"} err="failed to get container status \"21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb\": rpc error: code = NotFound desc = could not find container \"21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb\": container with ID starting with 21efa06b051991d468c6588af42b8506b40a4c9d34987290e0ff00ad0206d7cb not found: ID does not exist"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.773666 4921 scope.go:117] "RemoveContainer" containerID="84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c"
Mar 12 14:28:16 crc kubenswrapper[4921]: E0312 14:28:16.774021 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c\": container with ID starting with 84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c not found: ID does not exist" containerID="84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.774133 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c"} err="failed to get container status \"84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c\": rpc error: code = NotFound desc = could not find container \"84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c\": container with ID starting with 84c7600aaedbf459fa6c71514ec5b77b9c858b63037ee39d1126be2a955e3d7c not found: ID does not exist"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.774225 4921 scope.go:117] "RemoveContainer" containerID="4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09"
Mar 12 14:28:16 crc kubenswrapper[4921]: E0312 14:28:16.774600 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09\": container with ID starting with 4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09 not found: ID does not exist" containerID="4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09"
Mar 12 14:28:16 crc kubenswrapper[4921]: I0312 14:28:16.774696 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09"} err="failed to get container status \"4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09\": rpc error: code = NotFound desc = could not find container \"4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09\": container with ID starting with 4f0d5db6a0b10b72067787ed634983cfc40e8f368d7e51db7e3cb30813629c09 not found: ID does not exist"
Mar 12 14:28:17 crc kubenswrapper[4921]: I0312 14:28:17.995530 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" path="/var/lib/kubelet/pods/21ce62d2-eeff-4060-b114-e4f59406c80d/volumes"
Mar 12 14:28:47 crc kubenswrapper[4921]: I0312 14:28:47.411844 4921 scope.go:117] "RemoveContainer" containerID="854a7a6c0535bd4d7baa5d328809d02e695426a2a3bcf4be42534dd99ce85415"
Mar 12 14:29:09 crc kubenswrapper[4921]: I0312 14:29:09.535479 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-dctml" podUID="9a31a895-ced3-4285-8105-448501c3ceac" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.179227 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555430-f5nxw"]
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.181928 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="registry-server"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.182311 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="registry-server"
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.182427 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="extract-utilities"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.182522 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="extract-utilities"
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.182652 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="extract-content"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.182745 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="extract-content"
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.182883 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="registry-server"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.182989 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="registry-server"
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.183093 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="extract-utilities"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.183190 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="extract-utilities"
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.183300 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="extract-content"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.183395 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="extract-content"
Mar 12 14:30:00 crc kubenswrapper[4921]: E0312 14:30:00.183501 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9874e8-164a-4006-8a19-b2c7266e9c3a" containerName="oc"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.183614 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9874e8-164a-4006-8a19-b2c7266e9c3a" containerName="oc"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.184113 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9874e8-164a-4006-8a19-b2c7266e9c3a" containerName="oc"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.184268 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe1a746-8428-467a-a8c9-c7ce289d4e37" containerName="registry-server"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.184390 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ce62d2-eeff-4060-b114-e4f59406c80d" containerName="registry-server"
Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.185544 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.188555 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.188894 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.189043 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.191490 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6"] Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.193478 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.195107 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.197000 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.200568 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555430-f5nxw"] Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.224967 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6"] Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.272870 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhmv\" (UniqueName: 
\"kubernetes.io/projected/118e714c-50a7-422c-9c55-a03871013348-kube-api-access-4lhmv\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.273047 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118e714c-50a7-422c-9c55-a03871013348-config-volume\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.273101 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118e714c-50a7-422c-9c55-a03871013348-secret-volume\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.273205 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84wkp\" (UniqueName: \"kubernetes.io/projected/fb5d8d7f-c30c-49d1-bbae-15118735c509-kube-api-access-84wkp\") pod \"auto-csr-approver-29555430-f5nxw\" (UID: \"fb5d8d7f-c30c-49d1-bbae-15118735c509\") " pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.375272 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118e714c-50a7-422c-9c55-a03871013348-secret-volume\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 
crc kubenswrapper[4921]: I0312 14:30:00.375380 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84wkp\" (UniqueName: \"kubernetes.io/projected/fb5d8d7f-c30c-49d1-bbae-15118735c509-kube-api-access-84wkp\") pod \"auto-csr-approver-29555430-f5nxw\" (UID: \"fb5d8d7f-c30c-49d1-bbae-15118735c509\") " pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.375441 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhmv\" (UniqueName: \"kubernetes.io/projected/118e714c-50a7-422c-9c55-a03871013348-kube-api-access-4lhmv\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.375503 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118e714c-50a7-422c-9c55-a03871013348-config-volume\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.376491 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118e714c-50a7-422c-9c55-a03871013348-config-volume\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.387465 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118e714c-50a7-422c-9c55-a03871013348-secret-volume\") pod \"collect-profiles-29555430-cjlq6\" (UID: 
\"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.393540 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhmv\" (UniqueName: \"kubernetes.io/projected/118e714c-50a7-422c-9c55-a03871013348-kube-api-access-4lhmv\") pod \"collect-profiles-29555430-cjlq6\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.394999 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84wkp\" (UniqueName: \"kubernetes.io/projected/fb5d8d7f-c30c-49d1-bbae-15118735c509-kube-api-access-84wkp\") pod \"auto-csr-approver-29555430-f5nxw\" (UID: \"fb5d8d7f-c30c-49d1-bbae-15118735c509\") " pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.519749 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.529365 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:00 crc kubenswrapper[4921]: I0312 14:30:00.999571 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555430-f5nxw"] Mar 12 14:30:01 crc kubenswrapper[4921]: I0312 14:30:01.039627 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" event={"ID":"fb5d8d7f-c30c-49d1-bbae-15118735c509","Type":"ContainerStarted","Data":"a934c7c2d94fc577e8c9a03cd2ea938d45c7bce24702dcbe4f47e5897dd97249"} Mar 12 14:30:01 crc kubenswrapper[4921]: I0312 14:30:01.146401 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6"] Mar 12 14:30:02 crc kubenswrapper[4921]: I0312 14:30:02.050553 4921 generic.go:334] "Generic (PLEG): container finished" podID="118e714c-50a7-422c-9c55-a03871013348" containerID="b6b57bc1dbe0c66620682ae98b002231d25a3501b243d6cccc1010358e35ad2a" exitCode=0 Mar 12 14:30:02 crc kubenswrapper[4921]: I0312 14:30:02.050670 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" event={"ID":"118e714c-50a7-422c-9c55-a03871013348","Type":"ContainerDied","Data":"b6b57bc1dbe0c66620682ae98b002231d25a3501b243d6cccc1010358e35ad2a"} Mar 12 14:30:02 crc kubenswrapper[4921]: I0312 14:30:02.052142 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" event={"ID":"118e714c-50a7-422c-9c55-a03871013348","Type":"ContainerStarted","Data":"41de3624b7d89e1a6aca33fdd44121d7ce92c348925b5686fdc3654bb3aa642b"} Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.779629 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.847995 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118e714c-50a7-422c-9c55-a03871013348-secret-volume\") pod \"118e714c-50a7-422c-9c55-a03871013348\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.848438 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118e714c-50a7-422c-9c55-a03871013348-config-volume\") pod \"118e714c-50a7-422c-9c55-a03871013348\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.848556 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhmv\" (UniqueName: \"kubernetes.io/projected/118e714c-50a7-422c-9c55-a03871013348-kube-api-access-4lhmv\") pod \"118e714c-50a7-422c-9c55-a03871013348\" (UID: \"118e714c-50a7-422c-9c55-a03871013348\") " Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.848926 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/118e714c-50a7-422c-9c55-a03871013348-config-volume" (OuterVolumeSpecName: "config-volume") pod "118e714c-50a7-422c-9c55-a03871013348" (UID: "118e714c-50a7-422c-9c55-a03871013348"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.849372 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/118e714c-50a7-422c-9c55-a03871013348-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.855889 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/118e714c-50a7-422c-9c55-a03871013348-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "118e714c-50a7-422c-9c55-a03871013348" (UID: "118e714c-50a7-422c-9c55-a03871013348"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.856093 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118e714c-50a7-422c-9c55-a03871013348-kube-api-access-4lhmv" (OuterVolumeSpecName: "kube-api-access-4lhmv") pod "118e714c-50a7-422c-9c55-a03871013348" (UID: "118e714c-50a7-422c-9c55-a03871013348"). InnerVolumeSpecName "kube-api-access-4lhmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.950927 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/118e714c-50a7-422c-9c55-a03871013348-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:30:03 crc kubenswrapper[4921]: I0312 14:30:03.950971 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lhmv\" (UniqueName: \"kubernetes.io/projected/118e714c-50a7-422c-9c55-a03871013348-kube-api-access-4lhmv\") on node \"crc\" DevicePath \"\"" Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.070703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" event={"ID":"fb5d8d7f-c30c-49d1-bbae-15118735c509","Type":"ContainerStarted","Data":"08a7da22d34f11c808f6e7ec4dd44145d9f2e4e8817574c3f8a3a61a792500bf"} Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.072761 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" event={"ID":"118e714c-50a7-422c-9c55-a03871013348","Type":"ContainerDied","Data":"41de3624b7d89e1a6aca33fdd44121d7ce92c348925b5686fdc3654bb3aa642b"} Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.072826 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41de3624b7d89e1a6aca33fdd44121d7ce92c348925b5686fdc3654bb3aa642b" Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.072891 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6" Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.092922 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" podStartSLOduration=1.667348031 podStartE2EDuration="4.092896806s" podCreationTimestamp="2026-03-12 14:30:00 +0000 UTC" firstStartedPulling="2026-03-12 14:30:01.002769888 +0000 UTC m=+4823.692841929" lastFinishedPulling="2026-03-12 14:30:03.428318733 +0000 UTC m=+4826.118390704" observedRunningTime="2026-03-12 14:30:04.088870892 +0000 UTC m=+4826.778942873" watchObservedRunningTime="2026-03-12 14:30:04.092896806 +0000 UTC m=+4826.782968807" Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.853917 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7"] Mar 12 14:30:04 crc kubenswrapper[4921]: I0312 14:30:04.861409 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555385-q57v7"] Mar 12 14:30:05 crc kubenswrapper[4921]: I0312 14:30:05.082247 4921 generic.go:334] "Generic (PLEG): container finished" podID="fb5d8d7f-c30c-49d1-bbae-15118735c509" containerID="08a7da22d34f11c808f6e7ec4dd44145d9f2e4e8817574c3f8a3a61a792500bf" exitCode=0 Mar 12 14:30:05 crc kubenswrapper[4921]: I0312 14:30:05.082293 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" event={"ID":"fb5d8d7f-c30c-49d1-bbae-15118735c509","Type":"ContainerDied","Data":"08a7da22d34f11c808f6e7ec4dd44145d9f2e4e8817574c3f8a3a61a792500bf"} Mar 12 14:30:05 crc kubenswrapper[4921]: I0312 14:30:05.996902 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="816dce54-4563-4471-815c-2c8ecbb5bad1" path="/var/lib/kubelet/pods/816dce54-4563-4471-815c-2c8ecbb5bad1/volumes" Mar 12 14:30:07 crc kubenswrapper[4921]: I0312 
14:30:07.100217 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" event={"ID":"fb5d8d7f-c30c-49d1-bbae-15118735c509","Type":"ContainerDied","Data":"a934c7c2d94fc577e8c9a03cd2ea938d45c7bce24702dcbe4f47e5897dd97249"} Mar 12 14:30:07 crc kubenswrapper[4921]: I0312 14:30:07.100523 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a934c7c2d94fc577e8c9a03cd2ea938d45c7bce24702dcbe4f47e5897dd97249" Mar 12 14:30:07 crc kubenswrapper[4921]: I0312 14:30:07.169342 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:07 crc kubenswrapper[4921]: I0312 14:30:07.334266 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84wkp\" (UniqueName: \"kubernetes.io/projected/fb5d8d7f-c30c-49d1-bbae-15118735c509-kube-api-access-84wkp\") pod \"fb5d8d7f-c30c-49d1-bbae-15118735c509\" (UID: \"fb5d8d7f-c30c-49d1-bbae-15118735c509\") " Mar 12 14:30:07 crc kubenswrapper[4921]: I0312 14:30:07.354589 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5d8d7f-c30c-49d1-bbae-15118735c509-kube-api-access-84wkp" (OuterVolumeSpecName: "kube-api-access-84wkp") pod "fb5d8d7f-c30c-49d1-bbae-15118735c509" (UID: "fb5d8d7f-c30c-49d1-bbae-15118735c509"). InnerVolumeSpecName "kube-api-access-84wkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:30:07 crc kubenswrapper[4921]: I0312 14:30:07.436807 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84wkp\" (UniqueName: \"kubernetes.io/projected/fb5d8d7f-c30c-49d1-bbae-15118735c509-kube-api-access-84wkp\") on node \"crc\" DevicePath \"\"" Mar 12 14:30:08 crc kubenswrapper[4921]: I0312 14:30:08.107402 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555430-f5nxw" Mar 12 14:30:08 crc kubenswrapper[4921]: I0312 14:30:08.233903 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555424-lx78r"] Mar 12 14:30:08 crc kubenswrapper[4921]: I0312 14:30:08.242903 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555424-lx78r"] Mar 12 14:30:09 crc kubenswrapper[4921]: I0312 14:30:09.993797 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ba2cd1-8f0a-454b-9fa4-18d7ea84706f" path="/var/lib/kubelet/pods/23ba2cd1-8f0a-454b-9fa4-18d7ea84706f/volumes" Mar 12 14:30:26 crc kubenswrapper[4921]: I0312 14:30:26.323940 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:30:26 crc kubenswrapper[4921]: I0312 14:30:26.325068 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:30:47 crc kubenswrapper[4921]: I0312 14:30:47.551319 4921 scope.go:117] "RemoveContainer" containerID="4f095873852ad8be7a6dc69a1860ece4812d82b6dc197f225f98f8d32d4b1bf7" Mar 12 14:30:47 crc kubenswrapper[4921]: I0312 14:30:47.601640 4921 scope.go:117] "RemoveContainer" containerID="a48add2b7ff4f878626c8146881de1bad7dd0ec1adfda661b4e4fad8a56794fe" Mar 12 14:30:56 crc kubenswrapper[4921]: I0312 14:30:56.323855 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:30:56 crc kubenswrapper[4921]: I0312 14:30:56.325270 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.323715 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.324371 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.324422 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.325211 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4aff2f6b38d448fe182dbeed50a960e793420d5a55853297536d4fb1c83afee3"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.325280 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://4aff2f6b38d448fe182dbeed50a960e793420d5a55853297536d4fb1c83afee3" gracePeriod=600 Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.808662 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="4aff2f6b38d448fe182dbeed50a960e793420d5a55853297536d4fb1c83afee3" exitCode=0 Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.808777 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"4aff2f6b38d448fe182dbeed50a960e793420d5a55853297536d4fb1c83afee3"} Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.808971 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6"} Mar 12 14:31:26 crc kubenswrapper[4921]: I0312 14:31:26.808993 4921 scope.go:117] "RemoveContainer" containerID="12370389863c3af766e7472c47055f71694070a2a03921a4ebd3ad148001649f" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.153835 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555432-v6b5m"] Mar 12 14:32:00 crc kubenswrapper[4921]: E0312 14:32:00.154955 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5d8d7f-c30c-49d1-bbae-15118735c509" containerName="oc" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.154974 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5d8d7f-c30c-49d1-bbae-15118735c509" containerName="oc" Mar 12 14:32:00 crc 
kubenswrapper[4921]: E0312 14:32:00.155013 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118e714c-50a7-422c-9c55-a03871013348" containerName="collect-profiles" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.155022 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="118e714c-50a7-422c-9c55-a03871013348" containerName="collect-profiles" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.155244 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5d8d7f-c30c-49d1-bbae-15118735c509" containerName="oc" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.155268 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="118e714c-50a7-422c-9c55-a03871013348" containerName="collect-profiles" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.155909 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.163002 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555432-v6b5m"] Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.168263 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.168284 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.174643 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.257104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkwsh\" (UniqueName: \"kubernetes.io/projected/b9c0c6dc-1279-4413-b50b-51bc90052ee4-kube-api-access-hkwsh\") pod 
\"auto-csr-approver-29555432-v6b5m\" (UID: \"b9c0c6dc-1279-4413-b50b-51bc90052ee4\") " pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.360853 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkwsh\" (UniqueName: \"kubernetes.io/projected/b9c0c6dc-1279-4413-b50b-51bc90052ee4-kube-api-access-hkwsh\") pod \"auto-csr-approver-29555432-v6b5m\" (UID: \"b9c0c6dc-1279-4413-b50b-51bc90052ee4\") " pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.395857 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkwsh\" (UniqueName: \"kubernetes.io/projected/b9c0c6dc-1279-4413-b50b-51bc90052ee4-kube-api-access-hkwsh\") pod \"auto-csr-approver-29555432-v6b5m\" (UID: \"b9c0c6dc-1279-4413-b50b-51bc90052ee4\") " pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.485453 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:00 crc kubenswrapper[4921]: I0312 14:32:00.991578 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555432-v6b5m"] Mar 12 14:32:00 crc kubenswrapper[4921]: W0312 14:32:00.996289 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9c0c6dc_1279_4413_b50b_51bc90052ee4.slice/crio-8be81edb84ddd2229282e70fa7e7dfed1ee0a5b75c7bacba6dcf960ba93bfb4a WatchSource:0}: Error finding container 8be81edb84ddd2229282e70fa7e7dfed1ee0a5b75c7bacba6dcf960ba93bfb4a: Status 404 returned error can't find the container with id 8be81edb84ddd2229282e70fa7e7dfed1ee0a5b75c7bacba6dcf960ba93bfb4a Mar 12 14:32:01 crc kubenswrapper[4921]: I0312 14:32:01.118897 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" event={"ID":"b9c0c6dc-1279-4413-b50b-51bc90052ee4","Type":"ContainerStarted","Data":"8be81edb84ddd2229282e70fa7e7dfed1ee0a5b75c7bacba6dcf960ba93bfb4a"} Mar 12 14:32:03 crc kubenswrapper[4921]: I0312 14:32:03.133364 4921 generic.go:334] "Generic (PLEG): container finished" podID="b9c0c6dc-1279-4413-b50b-51bc90052ee4" containerID="85335a632c5b9f99465a185589b65d59f09576dadde18531099e1346ada07db3" exitCode=0 Mar 12 14:32:03 crc kubenswrapper[4921]: I0312 14:32:03.133504 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" event={"ID":"b9c0c6dc-1279-4413-b50b-51bc90052ee4","Type":"ContainerDied","Data":"85335a632c5b9f99465a185589b65d59f09576dadde18531099e1346ada07db3"} Mar 12 14:32:04 crc kubenswrapper[4921]: I0312 14:32:04.815531 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:04 crc kubenswrapper[4921]: I0312 14:32:04.883090 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkwsh\" (UniqueName: \"kubernetes.io/projected/b9c0c6dc-1279-4413-b50b-51bc90052ee4-kube-api-access-hkwsh\") pod \"b9c0c6dc-1279-4413-b50b-51bc90052ee4\" (UID: \"b9c0c6dc-1279-4413-b50b-51bc90052ee4\") " Mar 12 14:32:04 crc kubenswrapper[4921]: I0312 14:32:04.890065 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c0c6dc-1279-4413-b50b-51bc90052ee4-kube-api-access-hkwsh" (OuterVolumeSpecName: "kube-api-access-hkwsh") pod "b9c0c6dc-1279-4413-b50b-51bc90052ee4" (UID: "b9c0c6dc-1279-4413-b50b-51bc90052ee4"). InnerVolumeSpecName "kube-api-access-hkwsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:32:04 crc kubenswrapper[4921]: I0312 14:32:04.985259 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkwsh\" (UniqueName: \"kubernetes.io/projected/b9c0c6dc-1279-4413-b50b-51bc90052ee4-kube-api-access-hkwsh\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:05 crc kubenswrapper[4921]: I0312 14:32:05.151597 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" event={"ID":"b9c0c6dc-1279-4413-b50b-51bc90052ee4","Type":"ContainerDied","Data":"8be81edb84ddd2229282e70fa7e7dfed1ee0a5b75c7bacba6dcf960ba93bfb4a"} Mar 12 14:32:05 crc kubenswrapper[4921]: I0312 14:32:05.151912 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8be81edb84ddd2229282e70fa7e7dfed1ee0a5b75c7bacba6dcf960ba93bfb4a" Mar 12 14:32:05 crc kubenswrapper[4921]: I0312 14:32:05.151678 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555432-v6b5m" Mar 12 14:32:05 crc kubenswrapper[4921]: I0312 14:32:05.889926 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555426-9v2qd"] Mar 12 14:32:05 crc kubenswrapper[4921]: I0312 14:32:05.898580 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555426-9v2qd"] Mar 12 14:32:05 crc kubenswrapper[4921]: I0312 14:32:05.992878 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8" path="/var/lib/kubelet/pods/a9b1133d-3b9b-4db4-bd83-c5fee1cf95b8/volumes" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.860144 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p9jwl"] Mar 12 14:32:08 crc kubenswrapper[4921]: E0312 14:32:08.861891 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c0c6dc-1279-4413-b50b-51bc90052ee4" containerName="oc" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.861966 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c0c6dc-1279-4413-b50b-51bc90052ee4" containerName="oc" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.862252 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c0c6dc-1279-4413-b50b-51bc90052ee4" containerName="oc" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.865402 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.872957 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9jwl"] Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.959915 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-catalog-content\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.959967 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-utilities\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:08 crc kubenswrapper[4921]: I0312 14:32:08.960043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87xh\" (UniqueName: \"kubernetes.io/projected/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-kube-api-access-r87xh\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.062406 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-catalog-content\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.062983 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-utilities\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.063359 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r87xh\" (UniqueName: \"kubernetes.io/projected/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-kube-api-access-r87xh\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.063246 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-utilities\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.062944 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-catalog-content\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.312477 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87xh\" (UniqueName: \"kubernetes.io/projected/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-kube-api-access-r87xh\") pod \"community-operators-p9jwl\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") " pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.489282 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:09 crc kubenswrapper[4921]: I0312 14:32:09.951060 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9jwl"] Mar 12 14:32:09 crc kubenswrapper[4921]: W0312 14:32:09.955807 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3676f34a_5a2a_49ce_96fc_ec150d8cd6d1.slice/crio-47ac42119835c1fffacb0ded8f5954f03b09b09fa408da7ca743b1940c6b4f02 WatchSource:0}: Error finding container 47ac42119835c1fffacb0ded8f5954f03b09b09fa408da7ca743b1940c6b4f02: Status 404 returned error can't find the container with id 47ac42119835c1fffacb0ded8f5954f03b09b09fa408da7ca743b1940c6b4f02 Mar 12 14:32:10 crc kubenswrapper[4921]: I0312 14:32:10.190993 4921 generic.go:334] "Generic (PLEG): container finished" podID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerID="f9205d5e51f29134dc1104be11b480e5c6d1de432f6dfa9b9f1b3901ce2eabaf" exitCode=0 Mar 12 14:32:10 crc kubenswrapper[4921]: I0312 14:32:10.191041 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9jwl" event={"ID":"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1","Type":"ContainerDied","Data":"f9205d5e51f29134dc1104be11b480e5c6d1de432f6dfa9b9f1b3901ce2eabaf"} Mar 12 14:32:10 crc kubenswrapper[4921]: I0312 14:32:10.191067 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9jwl" event={"ID":"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1","Type":"ContainerStarted","Data":"47ac42119835c1fffacb0ded8f5954f03b09b09fa408da7ca743b1940c6b4f02"} Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.254118 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ktj6h"] Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.256542 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.263968 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktj6h"] Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.308544 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-catalog-content\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.308664 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w27pv\" (UniqueName: \"kubernetes.io/projected/f49a4347-0cb5-4716-95a0-d93667f7e1ca-kube-api-access-w27pv\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.308788 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-utilities\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.410866 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-catalog-content\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.410963 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-w27pv\" (UniqueName: \"kubernetes.io/projected/f49a4347-0cb5-4716-95a0-d93667f7e1ca-kube-api-access-w27pv\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.411067 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-utilities\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.411349 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-catalog-content\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.411486 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-utilities\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.445475 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w27pv\" (UniqueName: \"kubernetes.io/projected/f49a4347-0cb5-4716-95a0-d93667f7e1ca-kube-api-access-w27pv\") pod \"redhat-marketplace-ktj6h\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:11 crc kubenswrapper[4921]: I0312 14:32:11.612885 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:12 crc kubenswrapper[4921]: I0312 14:32:12.112264 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktj6h"] Mar 12 14:32:12 crc kubenswrapper[4921]: I0312 14:32:12.209656 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerStarted","Data":"cdb44a338119df1e1305c3f979253bf2bc3e467b3c216ad7ca3448464df930ff"} Mar 12 14:32:13 crc kubenswrapper[4921]: I0312 14:32:13.221043 4921 generic.go:334] "Generic (PLEG): container finished" podID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerID="a61998a773a4a3ee9d3ab4d6f1ceb5850d8bd7ad9b64709700cd89b2bf7dfd7b" exitCode=0 Mar 12 14:32:13 crc kubenswrapper[4921]: I0312 14:32:13.221210 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerDied","Data":"a61998a773a4a3ee9d3ab4d6f1ceb5850d8bd7ad9b64709700cd89b2bf7dfd7b"} Mar 12 14:32:17 crc kubenswrapper[4921]: I0312 14:32:17.269477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerStarted","Data":"d948ed2bfacc3484a4deb934a24feea8669e6a7dba1162a417946ace9e39e2c4"} Mar 12 14:32:17 crc kubenswrapper[4921]: I0312 14:32:17.277150 4921 generic.go:334] "Generic (PLEG): container finished" podID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerID="11b4fed238329cc5f698a62cbfcae7a83cdede498550ed9f6cd995d608dc9e95" exitCode=0 Mar 12 14:32:17 crc kubenswrapper[4921]: I0312 14:32:17.277213 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9jwl" 
event={"ID":"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1","Type":"ContainerDied","Data":"11b4fed238329cc5f698a62cbfcae7a83cdede498550ed9f6cd995d608dc9e95"} Mar 12 14:32:19 crc kubenswrapper[4921]: I0312 14:32:19.299363 4921 generic.go:334] "Generic (PLEG): container finished" podID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerID="d948ed2bfacc3484a4deb934a24feea8669e6a7dba1162a417946ace9e39e2c4" exitCode=0 Mar 12 14:32:19 crc kubenswrapper[4921]: I0312 14:32:19.299430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerDied","Data":"d948ed2bfacc3484a4deb934a24feea8669e6a7dba1162a417946ace9e39e2c4"} Mar 12 14:32:20 crc kubenswrapper[4921]: I0312 14:32:20.310414 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9jwl" event={"ID":"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1","Type":"ContainerStarted","Data":"9f946deaa317dec1a6d085307e0b2b2510010f28bb5468f01c87f677fbac90ee"} Mar 12 14:32:20 crc kubenswrapper[4921]: I0312 14:32:20.337280 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p9jwl" podStartSLOduration=3.430918294 podStartE2EDuration="12.337258315s" podCreationTimestamp="2026-03-12 14:32:08 +0000 UTC" firstStartedPulling="2026-03-12 14:32:10.193188561 +0000 UTC m=+4952.883260532" lastFinishedPulling="2026-03-12 14:32:19.099528582 +0000 UTC m=+4961.789600553" observedRunningTime="2026-03-12 14:32:20.328634039 +0000 UTC m=+4963.018706010" watchObservedRunningTime="2026-03-12 14:32:20.337258315 +0000 UTC m=+4963.027330286" Mar 12 14:32:21 crc kubenswrapper[4921]: I0312 14:32:21.321831 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" 
event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerStarted","Data":"df637adff59d306d2d3ac060fa6f7a5d3554bad512d257319482e9cb26db2abd"} Mar 12 14:32:21 crc kubenswrapper[4921]: I0312 14:32:21.341334 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ktj6h" podStartSLOduration=3.779226804 podStartE2EDuration="10.341315552s" podCreationTimestamp="2026-03-12 14:32:11 +0000 UTC" firstStartedPulling="2026-03-12 14:32:13.225094721 +0000 UTC m=+4955.915166682" lastFinishedPulling="2026-03-12 14:32:19.787183459 +0000 UTC m=+4962.477255430" observedRunningTime="2026-03-12 14:32:21.33768753 +0000 UTC m=+4964.027759521" watchObservedRunningTime="2026-03-12 14:32:21.341315552 +0000 UTC m=+4964.031387523" Mar 12 14:32:21 crc kubenswrapper[4921]: I0312 14:32:21.613395 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:21 crc kubenswrapper[4921]: I0312 14:32:21.613446 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:22 crc kubenswrapper[4921]: I0312 14:32:22.667431 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-ktj6h" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="registry-server" probeResult="failure" output=< Mar 12 14:32:22 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:32:22 crc kubenswrapper[4921]: > Mar 12 14:32:29 crc kubenswrapper[4921]: I0312 14:32:29.490296 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:29 crc kubenswrapper[4921]: I0312 14:32:29.491970 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:29 crc 
kubenswrapper[4921]: I0312 14:32:29.539280 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:31 crc kubenswrapper[4921]: I0312 14:32:31.247047 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p9jwl" Mar 12 14:32:31 crc kubenswrapper[4921]: I0312 14:32:31.316428 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p9jwl"] Mar 12 14:32:31 crc kubenswrapper[4921]: I0312 14:32:31.379193 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 14:32:31 crc kubenswrapper[4921]: I0312 14:32:31.379472 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cfsth" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="registry-server" containerID="cri-o://612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8" gracePeriod=2 Mar 12 14:32:31 crc kubenswrapper[4921]: I0312 14:32:31.662887 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:31 crc kubenswrapper[4921]: I0312 14:32:31.741375 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.024571 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfsth" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.153681 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-catalog-content\") pod \"74d62bfa-7599-4992-b5e7-4220aa3a6443\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.153740 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-utilities\") pod \"74d62bfa-7599-4992-b5e7-4220aa3a6443\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.153850 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55chk\" (UniqueName: \"kubernetes.io/projected/74d62bfa-7599-4992-b5e7-4220aa3a6443-kube-api-access-55chk\") pod \"74d62bfa-7599-4992-b5e7-4220aa3a6443\" (UID: \"74d62bfa-7599-4992-b5e7-4220aa3a6443\") " Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.155436 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-utilities" (OuterVolumeSpecName: "utilities") pod "74d62bfa-7599-4992-b5e7-4220aa3a6443" (UID: "74d62bfa-7599-4992-b5e7-4220aa3a6443"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.179268 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d62bfa-7599-4992-b5e7-4220aa3a6443-kube-api-access-55chk" (OuterVolumeSpecName: "kube-api-access-55chk") pod "74d62bfa-7599-4992-b5e7-4220aa3a6443" (UID: "74d62bfa-7599-4992-b5e7-4220aa3a6443"). InnerVolumeSpecName "kube-api-access-55chk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.224215 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74d62bfa-7599-4992-b5e7-4220aa3a6443" (UID: "74d62bfa-7599-4992-b5e7-4220aa3a6443"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.255966 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.256009 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74d62bfa-7599-4992-b5e7-4220aa3a6443-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.256018 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55chk\" (UniqueName: \"kubernetes.io/projected/74d62bfa-7599-4992-b5e7-4220aa3a6443-kube-api-access-55chk\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.422291 4921 generic.go:334] "Generic (PLEG): container finished" podID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerID="612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8" exitCode=0 Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.422342 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfsth" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.422406 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfsth" event={"ID":"74d62bfa-7599-4992-b5e7-4220aa3a6443","Type":"ContainerDied","Data":"612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8"} Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.422434 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfsth" event={"ID":"74d62bfa-7599-4992-b5e7-4220aa3a6443","Type":"ContainerDied","Data":"2f3c90d7299a4227e42a6b82283f5f543968e4b0e5a6ede7e1d3e0f66ca1e02f"} Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.422450 4921 scope.go:117] "RemoveContainer" containerID="612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.455685 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.456615 4921 scope.go:117] "RemoveContainer" containerID="f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.465300 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cfsth"] Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.485773 4921 scope.go:117] "RemoveContainer" containerID="58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.533197 4921 scope.go:117] "RemoveContainer" containerID="612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8" Mar 12 14:32:32 crc kubenswrapper[4921]: E0312 14:32:32.533565 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8\": container with ID starting with 612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8 not found: ID does not exist" containerID="612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.533591 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8"} err="failed to get container status \"612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8\": rpc error: code = NotFound desc = could not find container \"612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8\": container with ID starting with 612cdfde1ebeecd20e592969d30f2fae7d04770150f6259ab7b606ecfa59ccc8 not found: ID does not exist" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.533617 4921 scope.go:117] "RemoveContainer" containerID="f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8" Mar 12 14:32:32 crc kubenswrapper[4921]: E0312 14:32:32.533957 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8\": container with ID starting with f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8 not found: ID does not exist" containerID="f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.534014 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8"} err="failed to get container status \"f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8\": rpc error: code = NotFound desc = could not find container \"f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8\": container with ID 
starting with f4e5d20dd222ff23b06c587f636838e8ddd6d6d72bda507cec02d43cdaa02bd8 not found: ID does not exist" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.534046 4921 scope.go:117] "RemoveContainer" containerID="58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562" Mar 12 14:32:32 crc kubenswrapper[4921]: E0312 14:32:32.534378 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562\": container with ID starting with 58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562 not found: ID does not exist" containerID="58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562" Mar 12 14:32:32 crc kubenswrapper[4921]: I0312 14:32:32.534412 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562"} err="failed to get container status \"58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562\": rpc error: code = NotFound desc = could not find container \"58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562\": container with ID starting with 58a54eba602eb05bf019583308cd86314f6a1713355b869c1988d88f51460562 not found: ID does not exist" Mar 12 14:32:33 crc kubenswrapper[4921]: I0312 14:32:33.922499 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktj6h"] Mar 12 14:32:33 crc kubenswrapper[4921]: I0312 14:32:33.923082 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ktj6h" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="registry-server" containerID="cri-o://df637adff59d306d2d3ac060fa6f7a5d3554bad512d257319482e9cb26db2abd" gracePeriod=2 Mar 12 14:32:33 crc kubenswrapper[4921]: I0312 14:32:33.994720 4921 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" path="/var/lib/kubelet/pods/74d62bfa-7599-4992-b5e7-4220aa3a6443/volumes" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.439349 4921 generic.go:334] "Generic (PLEG): container finished" podID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerID="df637adff59d306d2d3ac060fa6f7a5d3554bad512d257319482e9cb26db2abd" exitCode=0 Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.439613 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerDied","Data":"df637adff59d306d2d3ac060fa6f7a5d3554bad512d257319482e9cb26db2abd"} Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.621115 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.806176 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-catalog-content\") pod \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.806256 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-utilities\") pod \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.806333 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w27pv\" (UniqueName: \"kubernetes.io/projected/f49a4347-0cb5-4716-95a0-d93667f7e1ca-kube-api-access-w27pv\") pod \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\" (UID: \"f49a4347-0cb5-4716-95a0-d93667f7e1ca\") " Mar 12 
14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.806922 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-utilities" (OuterVolumeSpecName: "utilities") pod "f49a4347-0cb5-4716-95a0-d93667f7e1ca" (UID: "f49a4347-0cb5-4716-95a0-d93667f7e1ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.813037 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f49a4347-0cb5-4716-95a0-d93667f7e1ca-kube-api-access-w27pv" (OuterVolumeSpecName: "kube-api-access-w27pv") pod "f49a4347-0cb5-4716-95a0-d93667f7e1ca" (UID: "f49a4347-0cb5-4716-95a0-d93667f7e1ca"). InnerVolumeSpecName "kube-api-access-w27pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.851314 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f49a4347-0cb5-4716-95a0-d93667f7e1ca" (UID: "f49a4347-0cb5-4716-95a0-d93667f7e1ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.908450 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w27pv\" (UniqueName: \"kubernetes.io/projected/f49a4347-0cb5-4716-95a0-d93667f7e1ca-kube-api-access-w27pv\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.908479 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:34 crc kubenswrapper[4921]: I0312 14:32:34.908492 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f49a4347-0cb5-4716-95a0-d93667f7e1ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.454432 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ktj6h" event={"ID":"f49a4347-0cb5-4716-95a0-d93667f7e1ca","Type":"ContainerDied","Data":"cdb44a338119df1e1305c3f979253bf2bc3e467b3c216ad7ca3448464df930ff"} Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.454683 4921 scope.go:117] "RemoveContainer" containerID="df637adff59d306d2d3ac060fa6f7a5d3554bad512d257319482e9cb26db2abd" Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.454531 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ktj6h" Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.475869 4921 scope.go:117] "RemoveContainer" containerID="d948ed2bfacc3484a4deb934a24feea8669e6a7dba1162a417946ace9e39e2c4" Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.507456 4921 scope.go:117] "RemoveContainer" containerID="a61998a773a4a3ee9d3ab4d6f1ceb5850d8bd7ad9b64709700cd89b2bf7dfd7b" Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.514549 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktj6h"] Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.525267 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ktj6h"] Mar 12 14:32:35 crc kubenswrapper[4921]: I0312 14:32:35.998180 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" path="/var/lib/kubelet/pods/f49a4347-0cb5-4716-95a0-d93667f7e1ca/volumes" Mar 12 14:32:47 crc kubenswrapper[4921]: I0312 14:32:47.699650 4921 scope.go:117] "RemoveContainer" containerID="210453b51ac5cc012689db2a82b19d465448f3d0f3e9be90decb327490f5fb75" Mar 12 14:33:26 crc kubenswrapper[4921]: I0312 14:33:26.323706 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:33:26 crc kubenswrapper[4921]: I0312 14:33:26.324275 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:33:56 crc kubenswrapper[4921]: 
I0312 14:33:56.324467 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:33:56 crc kubenswrapper[4921]: I0312 14:33:56.325573 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.142280 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555434-tv7kn"] Mar 12 14:34:00 crc kubenswrapper[4921]: E0312 14:34:00.143338 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="extract-utilities" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143357 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="extract-utilities" Mar 12 14:34:00 crc kubenswrapper[4921]: E0312 14:34:00.143371 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="extract-content" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143379 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="extract-content" Mar 12 14:34:00 crc kubenswrapper[4921]: E0312 14:34:00.143399 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="registry-server" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143408 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="registry-server" Mar 12 14:34:00 crc kubenswrapper[4921]: E0312 14:34:00.143421 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="extract-content" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143428 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="extract-content" Mar 12 14:34:00 crc kubenswrapper[4921]: E0312 14:34:00.143445 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="registry-server" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143454 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="registry-server" Mar 12 14:34:00 crc kubenswrapper[4921]: E0312 14:34:00.143465 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="extract-utilities" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143473 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="extract-utilities" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143712 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d62bfa-7599-4992-b5e7-4220aa3a6443" containerName="registry-server" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.143729 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f49a4347-0cb5-4716-95a0-d93667f7e1ca" containerName="registry-server" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.144613 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.146535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.146937 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.147168 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.155205 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555434-tv7kn"] Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.264006 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqw9j\" (UniqueName: \"kubernetes.io/projected/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7-kube-api-access-tqw9j\") pod \"auto-csr-approver-29555434-tv7kn\" (UID: \"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7\") " pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.365911 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqw9j\" (UniqueName: \"kubernetes.io/projected/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7-kube-api-access-tqw9j\") pod \"auto-csr-approver-29555434-tv7kn\" (UID: \"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7\") " pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.390606 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqw9j\" (UniqueName: \"kubernetes.io/projected/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7-kube-api-access-tqw9j\") pod \"auto-csr-approver-29555434-tv7kn\" (UID: \"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7\") " 
pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.466111 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.943388 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:34:00 crc kubenswrapper[4921]: I0312 14:34:00.944413 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555434-tv7kn"] Mar 12 14:34:01 crc kubenswrapper[4921]: I0312 14:34:01.221022 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" event={"ID":"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7","Type":"ContainerStarted","Data":"2cabfc82e9c1fa2ce8dc95063b89ecd76ea4938c1f3f91393b3c027f80009b8b"} Mar 12 14:34:03 crc kubenswrapper[4921]: I0312 14:34:03.239498 4921 generic.go:334] "Generic (PLEG): container finished" podID="fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7" containerID="0b1f50333fe75d1be69571abf0c9d5d76f8705f468340cc7dde63fb3f9705965" exitCode=0 Mar 12 14:34:03 crc kubenswrapper[4921]: I0312 14:34:03.239571 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" event={"ID":"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7","Type":"ContainerDied","Data":"0b1f50333fe75d1be69571abf0c9d5d76f8705f468340cc7dde63fb3f9705965"} Mar 12 14:34:04 crc kubenswrapper[4921]: I0312 14:34:04.802200 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:04 crc kubenswrapper[4921]: I0312 14:34:04.860045 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqw9j\" (UniqueName: \"kubernetes.io/projected/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7-kube-api-access-tqw9j\") pod \"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7\" (UID: \"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7\") " Mar 12 14:34:04 crc kubenswrapper[4921]: I0312 14:34:04.866303 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7-kube-api-access-tqw9j" (OuterVolumeSpecName: "kube-api-access-tqw9j") pod "fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7" (UID: "fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7"). InnerVolumeSpecName "kube-api-access-tqw9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:34:04 crc kubenswrapper[4921]: I0312 14:34:04.962537 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqw9j\" (UniqueName: \"kubernetes.io/projected/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7-kube-api-access-tqw9j\") on node \"crc\" DevicePath \"\"" Mar 12 14:34:05 crc kubenswrapper[4921]: I0312 14:34:05.256880 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" event={"ID":"fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7","Type":"ContainerDied","Data":"2cabfc82e9c1fa2ce8dc95063b89ecd76ea4938c1f3f91393b3c027f80009b8b"} Mar 12 14:34:05 crc kubenswrapper[4921]: I0312 14:34:05.257103 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cabfc82e9c1fa2ce8dc95063b89ecd76ea4938c1f3f91393b3c027f80009b8b" Mar 12 14:34:05 crc kubenswrapper[4921]: I0312 14:34:05.257104 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555434-tv7kn" Mar 12 14:34:05 crc kubenswrapper[4921]: I0312 14:34:05.866599 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555428-zclwx"] Mar 12 14:34:05 crc kubenswrapper[4921]: I0312 14:34:05.877257 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555428-zclwx"] Mar 12 14:34:05 crc kubenswrapper[4921]: I0312 14:34:05.993920 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9874e8-164a-4006-8a19-b2c7266e9c3a" path="/var/lib/kubelet/pods/3d9874e8-164a-4006-8a19-b2c7266e9c3a/volumes" Mar 12 14:34:26 crc kubenswrapper[4921]: I0312 14:34:26.323465 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:34:26 crc kubenswrapper[4921]: I0312 14:34:26.324035 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:34:26 crc kubenswrapper[4921]: I0312 14:34:26.324083 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:34:26 crc kubenswrapper[4921]: I0312 14:34:26.324855 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:34:26 crc kubenswrapper[4921]: I0312 14:34:26.324909 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" gracePeriod=600 Mar 12 14:34:27 crc kubenswrapper[4921]: E0312 14:34:27.038451 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:34:27 crc kubenswrapper[4921]: I0312 14:34:27.442677 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" exitCode=0 Mar 12 14:34:27 crc kubenswrapper[4921]: I0312 14:34:27.442724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6"} Mar 12 14:34:27 crc kubenswrapper[4921]: I0312 14:34:27.442762 4921 scope.go:117] "RemoveContainer" containerID="4aff2f6b38d448fe182dbeed50a960e793420d5a55853297536d4fb1c83afee3" Mar 12 14:34:27 crc kubenswrapper[4921]: I0312 14:34:27.443777 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:34:27 crc kubenswrapper[4921]: E0312 14:34:27.444282 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:34:42 crc kubenswrapper[4921]: I0312 14:34:42.983754 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:34:42 crc kubenswrapper[4921]: E0312 14:34:42.984488 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:34:47 crc kubenswrapper[4921]: I0312 14:34:47.856143 4921 scope.go:117] "RemoveContainer" containerID="d3959cbbd4c53a61345b0ed46e3c5018eaf7fa6e911925b7012b44d91b685fc4" Mar 12 14:34:57 crc kubenswrapper[4921]: I0312 14:34:57.989491 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:34:57 crc kubenswrapper[4921]: E0312 14:34:57.990358 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:35:10 crc kubenswrapper[4921]: I0312 
14:35:10.983697 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:35:10 crc kubenswrapper[4921]: E0312 14:35:10.985737 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:35:25 crc kubenswrapper[4921]: I0312 14:35:25.983995 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:35:25 crc kubenswrapper[4921]: E0312 14:35:25.985033 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:35:38 crc kubenswrapper[4921]: I0312 14:35:38.983795 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:35:38 crc kubenswrapper[4921]: E0312 14:35:38.984901 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:35:51 crc 
kubenswrapper[4921]: I0312 14:35:51.983110 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:35:51 crc kubenswrapper[4921]: E0312 14:35:51.983981 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.154521 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555436-zv8kj"] Mar 12 14:36:00 crc kubenswrapper[4921]: E0312 14:36:00.155479 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7" containerName="oc" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.155492 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7" containerName="oc" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.155669 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7" containerName="oc" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.156592 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.159545 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.159604 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.159664 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.182615 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555436-zv8kj"] Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.327486 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbf85\" (UniqueName: \"kubernetes.io/projected/0cd4fe7e-4531-412b-a9f3-03e79a87d30d-kube-api-access-wbf85\") pod \"auto-csr-approver-29555436-zv8kj\" (UID: \"0cd4fe7e-4531-412b-a9f3-03e79a87d30d\") " pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.429850 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbf85\" (UniqueName: \"kubernetes.io/projected/0cd4fe7e-4531-412b-a9f3-03e79a87d30d-kube-api-access-wbf85\") pod \"auto-csr-approver-29555436-zv8kj\" (UID: \"0cd4fe7e-4531-412b-a9f3-03e79a87d30d\") " pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.452842 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbf85\" (UniqueName: \"kubernetes.io/projected/0cd4fe7e-4531-412b-a9f3-03e79a87d30d-kube-api-access-wbf85\") pod \"auto-csr-approver-29555436-zv8kj\" (UID: \"0cd4fe7e-4531-412b-a9f3-03e79a87d30d\") " 
pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.482045 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:00 crc kubenswrapper[4921]: I0312 14:36:00.931517 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555436-zv8kj"] Mar 12 14:36:01 crc kubenswrapper[4921]: I0312 14:36:01.273240 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" event={"ID":"0cd4fe7e-4531-412b-a9f3-03e79a87d30d","Type":"ContainerStarted","Data":"e2c9380e228699fcaf54fc24d36ae066c42dfdc5ad5fcf270dd3f67b807f34fa"} Mar 12 14:36:02 crc kubenswrapper[4921]: I0312 14:36:02.286986 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" event={"ID":"0cd4fe7e-4531-412b-a9f3-03e79a87d30d","Type":"ContainerStarted","Data":"d6e68c525465f58a87589a60ffc38fe19df203fef38a67c218cdbb778d8335aa"} Mar 12 14:36:02 crc kubenswrapper[4921]: I0312 14:36:02.321164 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" podStartSLOduration=1.421650068 podStartE2EDuration="2.321140616s" podCreationTimestamp="2026-03-12 14:36:00 +0000 UTC" firstStartedPulling="2026-03-12 14:36:00.940745436 +0000 UTC m=+5183.630817407" lastFinishedPulling="2026-03-12 14:36:01.840235984 +0000 UTC m=+5184.530307955" observedRunningTime="2026-03-12 14:36:02.313239001 +0000 UTC m=+5185.003310982" watchObservedRunningTime="2026-03-12 14:36:02.321140616 +0000 UTC m=+5185.011212577" Mar 12 14:36:02 crc kubenswrapper[4921]: E0312 14:36:02.572513 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cd4fe7e_4531_412b_a9f3_03e79a87d30d.slice/crio-conmon-d6e68c525465f58a87589a60ffc38fe19df203fef38a67c218cdbb778d8335aa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cd4fe7e_4531_412b_a9f3_03e79a87d30d.slice/crio-d6e68c525465f58a87589a60ffc38fe19df203fef38a67c218cdbb778d8335aa.scope\": RecentStats: unable to find data in memory cache]" Mar 12 14:36:03 crc kubenswrapper[4921]: I0312 14:36:03.297576 4921 generic.go:334] "Generic (PLEG): container finished" podID="0cd4fe7e-4531-412b-a9f3-03e79a87d30d" containerID="d6e68c525465f58a87589a60ffc38fe19df203fef38a67c218cdbb778d8335aa" exitCode=0 Mar 12 14:36:03 crc kubenswrapper[4921]: I0312 14:36:03.297749 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" event={"ID":"0cd4fe7e-4531-412b-a9f3-03e79a87d30d","Type":"ContainerDied","Data":"d6e68c525465f58a87589a60ffc38fe19df203fef38a67c218cdbb778d8335aa"} Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.318569 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" event={"ID":"0cd4fe7e-4531-412b-a9f3-03e79a87d30d","Type":"ContainerDied","Data":"e2c9380e228699fcaf54fc24d36ae066c42dfdc5ad5fcf270dd3f67b807f34fa"} Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.319132 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c9380e228699fcaf54fc24d36ae066c42dfdc5ad5fcf270dd3f67b807f34fa" Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.336343 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.536641 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbf85\" (UniqueName: \"kubernetes.io/projected/0cd4fe7e-4531-412b-a9f3-03e79a87d30d-kube-api-access-wbf85\") pod \"0cd4fe7e-4531-412b-a9f3-03e79a87d30d\" (UID: \"0cd4fe7e-4531-412b-a9f3-03e79a87d30d\") " Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.543025 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd4fe7e-4531-412b-a9f3-03e79a87d30d-kube-api-access-wbf85" (OuterVolumeSpecName: "kube-api-access-wbf85") pod "0cd4fe7e-4531-412b-a9f3-03e79a87d30d" (UID: "0cd4fe7e-4531-412b-a9f3-03e79a87d30d"). InnerVolumeSpecName "kube-api-access-wbf85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.638696 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbf85\" (UniqueName: \"kubernetes.io/projected/0cd4fe7e-4531-412b-a9f3-03e79a87d30d-kube-api-access-wbf85\") on node \"crc\" DevicePath \"\"" Mar 12 14:36:05 crc kubenswrapper[4921]: I0312 14:36:05.983151 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:36:05 crc kubenswrapper[4921]: E0312 14:36:05.983552 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:36:06 crc kubenswrapper[4921]: I0312 14:36:06.327738 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555436-zv8kj" Mar 12 14:36:06 crc kubenswrapper[4921]: I0312 14:36:06.415211 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555430-f5nxw"] Mar 12 14:36:06 crc kubenswrapper[4921]: I0312 14:36:06.443980 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555430-f5nxw"] Mar 12 14:36:08 crc kubenswrapper[4921]: I0312 14:36:08.000392 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5d8d7f-c30c-49d1-bbae-15118735c509" path="/var/lib/kubelet/pods/fb5d8d7f-c30c-49d1-bbae-15118735c509/volumes" Mar 12 14:36:19 crc kubenswrapper[4921]: I0312 14:36:19.983940 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:36:19 crc kubenswrapper[4921]: E0312 14:36:19.984688 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:36:34 crc kubenswrapper[4921]: I0312 14:36:34.983447 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:36:34 crc kubenswrapper[4921]: E0312 14:36:34.984302 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:36:47 crc kubenswrapper[4921]: I0312 14:36:47.939704 4921 scope.go:117] "RemoveContainer" containerID="08a7da22d34f11c808f6e7ec4dd44145d9f2e4e8817574c3f8a3a61a792500bf" Mar 12 14:36:48 crc kubenswrapper[4921]: I0312 14:36:48.984845 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:36:48 crc kubenswrapper[4921]: E0312 14:36:48.986694 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:37:01 crc kubenswrapper[4921]: I0312 14:37:01.984159 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:37:01 crc kubenswrapper[4921]: E0312 14:37:01.986156 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:37:13 crc kubenswrapper[4921]: I0312 14:37:13.983829 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:37:13 crc kubenswrapper[4921]: E0312 14:37:13.984548 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:37:26 crc kubenswrapper[4921]: I0312 14:37:26.984533 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:37:26 crc kubenswrapper[4921]: E0312 14:37:26.985978 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:37:37 crc kubenswrapper[4921]: I0312 14:37:37.983484 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:37:37 crc kubenswrapper[4921]: E0312 14:37:37.984353 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:37:50 crc kubenswrapper[4921]: I0312 14:37:50.983657 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:37:50 crc kubenswrapper[4921]: E0312 14:37:50.984422 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.143331 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555438-4cqxb"] Mar 12 14:38:00 crc kubenswrapper[4921]: E0312 14:38:00.144362 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd4fe7e-4531-412b-a9f3-03e79a87d30d" containerName="oc" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.144379 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd4fe7e-4531-412b-a9f3-03e79a87d30d" containerName="oc" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.144621 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd4fe7e-4531-412b-a9f3-03e79a87d30d" containerName="oc" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.145414 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.149327 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.149519 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.149639 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.153649 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555438-4cqxb"] Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.280207 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgtzw\" (UniqueName: \"kubernetes.io/projected/c865d942-d9a9-4a66-8b9e-b19c82064aeb-kube-api-access-bgtzw\") pod \"auto-csr-approver-29555438-4cqxb\" (UID: \"c865d942-d9a9-4a66-8b9e-b19c82064aeb\") " pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.383022 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgtzw\" (UniqueName: \"kubernetes.io/projected/c865d942-d9a9-4a66-8b9e-b19c82064aeb-kube-api-access-bgtzw\") pod \"auto-csr-approver-29555438-4cqxb\" (UID: \"c865d942-d9a9-4a66-8b9e-b19c82064aeb\") " pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.401336 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgtzw\" (UniqueName: \"kubernetes.io/projected/c865d942-d9a9-4a66-8b9e-b19c82064aeb-kube-api-access-bgtzw\") pod \"auto-csr-approver-29555438-4cqxb\" (UID: \"c865d942-d9a9-4a66-8b9e-b19c82064aeb\") " 
pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.466133 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:00 crc kubenswrapper[4921]: I0312 14:38:00.956365 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555438-4cqxb"] Mar 12 14:38:00 crc kubenswrapper[4921]: W0312 14:38:00.974835 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc865d942_d9a9_4a66_8b9e_b19c82064aeb.slice/crio-6eb22d32793537f6d140638a80dda50f2ccf69da6aca11faab4147d0a85ea40e WatchSource:0}: Error finding container 6eb22d32793537f6d140638a80dda50f2ccf69da6aca11faab4147d0a85ea40e: Status 404 returned error can't find the container with id 6eb22d32793537f6d140638a80dda50f2ccf69da6aca11faab4147d0a85ea40e Mar 12 14:38:01 crc kubenswrapper[4921]: I0312 14:38:01.307092 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" event={"ID":"c865d942-d9a9-4a66-8b9e-b19c82064aeb","Type":"ContainerStarted","Data":"6eb22d32793537f6d140638a80dda50f2ccf69da6aca11faab4147d0a85ea40e"} Mar 12 14:38:03 crc kubenswrapper[4921]: I0312 14:38:03.326181 4921 generic.go:334] "Generic (PLEG): container finished" podID="c865d942-d9a9-4a66-8b9e-b19c82064aeb" containerID="9576560a889bf00c9d3fac5b181891be9e41b593c5e589d9923ab7ec6d4f81f8" exitCode=0 Mar 12 14:38:03 crc kubenswrapper[4921]: I0312 14:38:03.326397 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" event={"ID":"c865d942-d9a9-4a66-8b9e-b19c82064aeb","Type":"ContainerDied","Data":"9576560a889bf00c9d3fac5b181891be9e41b593c5e589d9923ab7ec6d4f81f8"} Mar 12 14:38:04 crc kubenswrapper[4921]: I0312 14:38:04.884353 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.075766 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgtzw\" (UniqueName: \"kubernetes.io/projected/c865d942-d9a9-4a66-8b9e-b19c82064aeb-kube-api-access-bgtzw\") pod \"c865d942-d9a9-4a66-8b9e-b19c82064aeb\" (UID: \"c865d942-d9a9-4a66-8b9e-b19c82064aeb\") " Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.096553 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c865d942-d9a9-4a66-8b9e-b19c82064aeb-kube-api-access-bgtzw" (OuterVolumeSpecName: "kube-api-access-bgtzw") pod "c865d942-d9a9-4a66-8b9e-b19c82064aeb" (UID: "c865d942-d9a9-4a66-8b9e-b19c82064aeb"). InnerVolumeSpecName "kube-api-access-bgtzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.181454 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgtzw\" (UniqueName: \"kubernetes.io/projected/c865d942-d9a9-4a66-8b9e-b19c82064aeb-kube-api-access-bgtzw\") on node \"crc\" DevicePath \"\"" Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.350513 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" event={"ID":"c865d942-d9a9-4a66-8b9e-b19c82064aeb","Type":"ContainerDied","Data":"6eb22d32793537f6d140638a80dda50f2ccf69da6aca11faab4147d0a85ea40e"} Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.350555 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb22d32793537f6d140638a80dda50f2ccf69da6aca11faab4147d0a85ea40e" Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.350622 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555438-4cqxb" Mar 12 14:38:05 crc kubenswrapper[4921]: E0312 14:38:05.490326 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc865d942_d9a9_4a66_8b9e_b19c82064aeb.slice\": RecentStats: unable to find data in memory cache]" Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.965092 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555432-v6b5m"] Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.975642 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555432-v6b5m"] Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.984120 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:38:05 crc kubenswrapper[4921]: E0312 14:38:05.984581 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:38:05 crc kubenswrapper[4921]: I0312 14:38:05.997485 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c0c6dc-1279-4413-b50b-51bc90052ee4" path="/var/lib/kubelet/pods/b9c0c6dc-1279-4413-b50b-51bc90052ee4/volumes" Mar 12 14:38:16 crc kubenswrapper[4921]: I0312 14:38:16.984040 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:38:16 crc kubenswrapper[4921]: E0312 14:38:16.984901 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:38:19 crc kubenswrapper[4921]: I0312 14:38:19.907399 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzc8j"] Mar 12 14:38:19 crc kubenswrapper[4921]: E0312 14:38:19.908001 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c865d942-d9a9-4a66-8b9e-b19c82064aeb" containerName="oc" Mar 12 14:38:19 crc kubenswrapper[4921]: I0312 14:38:19.908014 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c865d942-d9a9-4a66-8b9e-b19c82064aeb" containerName="oc" Mar 12 14:38:19 crc kubenswrapper[4921]: I0312 14:38:19.908207 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c865d942-d9a9-4a66-8b9e-b19c82064aeb" containerName="oc" Mar 12 14:38:19 crc kubenswrapper[4921]: I0312 14:38:19.909439 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:19 crc kubenswrapper[4921]: I0312 14:38:19.925727 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzc8j"] Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.002799 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-utilities\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.002958 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-catalog-content\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.003038 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbbr7\" (UniqueName: \"kubernetes.io/projected/13d46769-0d17-4322-99ad-4d7a9cd1b24a-kube-api-access-nbbr7\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.104444 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-catalog-content\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.104552 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nbbr7\" (UniqueName: \"kubernetes.io/projected/13d46769-0d17-4322-99ad-4d7a9cd1b24a-kube-api-access-nbbr7\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.104607 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-utilities\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.105078 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-utilities\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.105292 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-catalog-content\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.129582 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbbr7\" (UniqueName: \"kubernetes.io/projected/13d46769-0d17-4322-99ad-4d7a9cd1b24a-kube-api-access-nbbr7\") pod \"certified-operators-vzc8j\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.226711 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:20 crc kubenswrapper[4921]: I0312 14:38:20.726894 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzc8j"] Mar 12 14:38:21 crc kubenswrapper[4921]: I0312 14:38:21.538600 4921 generic.go:334] "Generic (PLEG): container finished" podID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerID="3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc" exitCode=0 Mar 12 14:38:21 crc kubenswrapper[4921]: I0312 14:38:21.538676 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerDied","Data":"3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc"} Mar 12 14:38:21 crc kubenswrapper[4921]: I0312 14:38:21.540570 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerStarted","Data":"c7753a7c8f49ae846d4703f84d6f19d4d681dd5ac79d831b4ae81cb08caea059"} Mar 12 14:38:22 crc kubenswrapper[4921]: I0312 14:38:22.553626 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerStarted","Data":"da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4"} Mar 12 14:38:24 crc kubenswrapper[4921]: I0312 14:38:24.577926 4921 generic.go:334] "Generic (PLEG): container finished" podID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerID="da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4" exitCode=0 Mar 12 14:38:24 crc kubenswrapper[4921]: I0312 14:38:24.578015 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" 
event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerDied","Data":"da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4"} Mar 12 14:38:25 crc kubenswrapper[4921]: I0312 14:38:25.590582 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerStarted","Data":"874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46"} Mar 12 14:38:25 crc kubenswrapper[4921]: I0312 14:38:25.620376 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzc8j" podStartSLOduration=3.171950591 podStartE2EDuration="6.620353275s" podCreationTimestamp="2026-03-12 14:38:19 +0000 UTC" firstStartedPulling="2026-03-12 14:38:21.540403949 +0000 UTC m=+5324.230475920" lastFinishedPulling="2026-03-12 14:38:24.988806633 +0000 UTC m=+5327.678878604" observedRunningTime="2026-03-12 14:38:25.611873863 +0000 UTC m=+5328.301945844" watchObservedRunningTime="2026-03-12 14:38:25.620353275 +0000 UTC m=+5328.310425256" Mar 12 14:38:27 crc kubenswrapper[4921]: I0312 14:38:27.991324 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:38:27 crc kubenswrapper[4921]: E0312 14:38:27.992267 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:38:30 crc kubenswrapper[4921]: I0312 14:38:30.227938 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:30 crc 
kubenswrapper[4921]: I0312 14:38:30.228322 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:30 crc kubenswrapper[4921]: I0312 14:38:30.273648 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:30 crc kubenswrapper[4921]: I0312 14:38:30.687723 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:30 crc kubenswrapper[4921]: I0312 14:38:30.742270 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzc8j"] Mar 12 14:38:32 crc kubenswrapper[4921]: I0312 14:38:32.654181 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzc8j" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="registry-server" containerID="cri-o://874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46" gracePeriod=2 Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.255773 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.369699 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbbr7\" (UniqueName: \"kubernetes.io/projected/13d46769-0d17-4322-99ad-4d7a9cd1b24a-kube-api-access-nbbr7\") pod \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.369777 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-utilities\") pod \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.369917 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-catalog-content\") pod \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\" (UID: \"13d46769-0d17-4322-99ad-4d7a9cd1b24a\") " Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.372469 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-utilities" (OuterVolumeSpecName: "utilities") pod "13d46769-0d17-4322-99ad-4d7a9cd1b24a" (UID: "13d46769-0d17-4322-99ad-4d7a9cd1b24a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.472369 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.476162 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13d46769-0d17-4322-99ad-4d7a9cd1b24a" (UID: "13d46769-0d17-4322-99ad-4d7a9cd1b24a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.574509 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13d46769-0d17-4322-99ad-4d7a9cd1b24a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.663388 4921 generic.go:334] "Generic (PLEG): container finished" podID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerID="874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46" exitCode=0 Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.663430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerDied","Data":"874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46"} Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.663457 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzc8j" event={"ID":"13d46769-0d17-4322-99ad-4d7a9cd1b24a","Type":"ContainerDied","Data":"c7753a7c8f49ae846d4703f84d6f19d4d681dd5ac79d831b4ae81cb08caea059"} Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.663473 4921 
scope.go:117] "RemoveContainer" containerID="874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.664498 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzc8j" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.682094 4921 scope.go:117] "RemoveContainer" containerID="da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.708482 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13d46769-0d17-4322-99ad-4d7a9cd1b24a-kube-api-access-nbbr7" (OuterVolumeSpecName: "kube-api-access-nbbr7") pod "13d46769-0d17-4322-99ad-4d7a9cd1b24a" (UID: "13d46769-0d17-4322-99ad-4d7a9cd1b24a"). InnerVolumeSpecName "kube-api-access-nbbr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.721294 4921 scope.go:117] "RemoveContainer" containerID="3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.780874 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbbr7\" (UniqueName: \"kubernetes.io/projected/13d46769-0d17-4322-99ad-4d7a9cd1b24a-kube-api-access-nbbr7\") on node \"crc\" DevicePath \"\"" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.804520 4921 scope.go:117] "RemoveContainer" containerID="874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46" Mar 12 14:38:33 crc kubenswrapper[4921]: E0312 14:38:33.805131 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46\": container with ID starting with 874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46 not found: ID does not exist" 
containerID="874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.805182 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46"} err="failed to get container status \"874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46\": rpc error: code = NotFound desc = could not find container \"874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46\": container with ID starting with 874e452d91c39c1e2bab2e4242fe208abd23ae1fbaa259aadd0365c6330f1a46 not found: ID does not exist" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.805207 4921 scope.go:117] "RemoveContainer" containerID="da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4" Mar 12 14:38:33 crc kubenswrapper[4921]: E0312 14:38:33.805469 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4\": container with ID starting with da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4 not found: ID does not exist" containerID="da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.805498 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4"} err="failed to get container status \"da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4\": rpc error: code = NotFound desc = could not find container \"da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4\": container with ID starting with da4de546212d481e764a1aa5cb7de68077c707c02fbb7e150edb4d8f6527eaa4 not found: ID does not exist" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.805518 4921 scope.go:117] 
"RemoveContainer" containerID="3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc" Mar 12 14:38:33 crc kubenswrapper[4921]: E0312 14:38:33.805750 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc\": container with ID starting with 3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc not found: ID does not exist" containerID="3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.805771 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc"} err="failed to get container status \"3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc\": rpc error: code = NotFound desc = could not find container \"3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc\": container with ID starting with 3227f3b6a309de8dd7b9a8072d84e7dffe3f2360286dd3a484c2bccf9f11c4fc not found: ID does not exist" Mar 12 14:38:33 crc kubenswrapper[4921]: I0312 14:38:33.995875 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzc8j"] Mar 12 14:38:34 crc kubenswrapper[4921]: I0312 14:38:34.011110 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzc8j"] Mar 12 14:38:35 crc kubenswrapper[4921]: I0312 14:38:35.995125 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" path="/var/lib/kubelet/pods/13d46769-0d17-4322-99ad-4d7a9cd1b24a/volumes" Mar 12 14:38:39 crc kubenswrapper[4921]: I0312 14:38:39.983477 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:38:39 crc kubenswrapper[4921]: E0312 14:38:39.985266 4921 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:38:48 crc kubenswrapper[4921]: I0312 14:38:48.039683 4921 scope.go:117] "RemoveContainer" containerID="85335a632c5b9f99465a185589b65d59f09576dadde18531099e1346ada07db3" Mar 12 14:38:50 crc kubenswrapper[4921]: I0312 14:38:50.983881 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:38:50 crc kubenswrapper[4921]: E0312 14:38:50.984531 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.376950 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vcvtv"] Mar 12 14:38:55 crc kubenswrapper[4921]: E0312 14:38:55.378029 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="registry-server" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.378045 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="registry-server" Mar 12 14:38:55 crc kubenswrapper[4921]: E0312 14:38:55.378065 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="extract-utilities" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.378074 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="extract-utilities" Mar 12 14:38:55 crc kubenswrapper[4921]: E0312 14:38:55.378091 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="extract-content" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.378098 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="extract-content" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.378331 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="13d46769-0d17-4322-99ad-4d7a9cd1b24a" containerName="registry-server" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.380064 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.390853 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcvtv"] Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.563861 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-catalog-content\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.564226 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhrm\" (UniqueName: \"kubernetes.io/projected/3ed89d84-fa38-42fa-82d6-54912e1f72d4-kube-api-access-6lhrm\") pod \"redhat-operators-vcvtv\" (UID: 
\"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.564426 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-utilities\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.666108 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-utilities\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.666421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-catalog-content\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.666502 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhrm\" (UniqueName: \"kubernetes.io/projected/3ed89d84-fa38-42fa-82d6-54912e1f72d4-kube-api-access-6lhrm\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.666774 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-utilities\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " 
pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.667001 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-catalog-content\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.690722 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhrm\" (UniqueName: \"kubernetes.io/projected/3ed89d84-fa38-42fa-82d6-54912e1f72d4-kube-api-access-6lhrm\") pod \"redhat-operators-vcvtv\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:55 crc kubenswrapper[4921]: I0312 14:38:55.720420 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:38:56 crc kubenswrapper[4921]: I0312 14:38:56.190102 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcvtv"] Mar 12 14:38:56 crc kubenswrapper[4921]: I0312 14:38:56.891243 4921 generic.go:334] "Generic (PLEG): container finished" podID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerID="7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499" exitCode=0 Mar 12 14:38:56 crc kubenswrapper[4921]: I0312 14:38:56.891318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerDied","Data":"7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499"} Mar 12 14:38:56 crc kubenswrapper[4921]: I0312 14:38:56.891669 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" 
event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerStarted","Data":"b3efdc2bd732c3cc8c2bf584e9fc59e82bd2c44cc38fc3aee3129eb0df609a87"} Mar 12 14:38:58 crc kubenswrapper[4921]: I0312 14:38:58.914501 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerStarted","Data":"412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6"} Mar 12 14:39:03 crc kubenswrapper[4921]: I0312 14:39:03.956298 4921 generic.go:334] "Generic (PLEG): container finished" podID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerID="412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6" exitCode=0 Mar 12 14:39:03 crc kubenswrapper[4921]: I0312 14:39:03.956841 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerDied","Data":"412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6"} Mar 12 14:39:03 crc kubenswrapper[4921]: I0312 14:39:03.959351 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:39:04 crc kubenswrapper[4921]: I0312 14:39:04.968431 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerStarted","Data":"e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8"} Mar 12 14:39:04 crc kubenswrapper[4921]: I0312 14:39:04.988977 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:39:04 crc kubenswrapper[4921]: E0312 14:39:04.989238 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:39:04 crc kubenswrapper[4921]: I0312 14:39:04.990957 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vcvtv" podStartSLOduration=2.509991728 podStartE2EDuration="9.990937403s" podCreationTimestamp="2026-03-12 14:38:55 +0000 UTC" firstStartedPulling="2026-03-12 14:38:56.893401799 +0000 UTC m=+5359.583473760" lastFinishedPulling="2026-03-12 14:39:04.374347464 +0000 UTC m=+5367.064419435" observedRunningTime="2026-03-12 14:39:04.988448906 +0000 UTC m=+5367.678520867" watchObservedRunningTime="2026-03-12 14:39:04.990937403 +0000 UTC m=+5367.681009364" Mar 12 14:39:05 crc kubenswrapper[4921]: I0312 14:39:05.721524 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:39:05 crc kubenswrapper[4921]: I0312 14:39:05.722073 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:39:06 crc kubenswrapper[4921]: I0312 14:39:06.797342 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vcvtv" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="registry-server" probeResult="failure" output=< Mar 12 14:39:06 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:39:06 crc kubenswrapper[4921]: > Mar 12 14:39:15 crc kubenswrapper[4921]: I0312 14:39:15.779963 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:39:15 crc kubenswrapper[4921]: I0312 14:39:15.835257 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:39:15 crc kubenswrapper[4921]: I0312 14:39:15.983976 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:39:15 crc kubenswrapper[4921]: E0312 14:39:15.984297 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:39:16 crc kubenswrapper[4921]: I0312 14:39:16.023864 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcvtv"] Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.113835 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vcvtv" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="registry-server" containerID="cri-o://e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8" gracePeriod=2 Mar 12 14:39:17 crc kubenswrapper[4921]: E0312 14:39:17.306028 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed89d84_fa38_42fa_82d6_54912e1f72d4.slice/crio-e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed89d84_fa38_42fa_82d6_54912e1f72d4.slice/crio-conmon-e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8.scope\": RecentStats: unable to find data in memory cache]" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.702399 4921 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.740107 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-utilities\") pod \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.740286 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-catalog-content\") pod \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.740378 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lhrm\" (UniqueName: \"kubernetes.io/projected/3ed89d84-fa38-42fa-82d6-54912e1f72d4-kube-api-access-6lhrm\") pod \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\" (UID: \"3ed89d84-fa38-42fa-82d6-54912e1f72d4\") " Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.740941 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-utilities" (OuterVolumeSpecName: "utilities") pod "3ed89d84-fa38-42fa-82d6-54912e1f72d4" (UID: "3ed89d84-fa38-42fa-82d6-54912e1f72d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.741168 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.749328 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed89d84-fa38-42fa-82d6-54912e1f72d4-kube-api-access-6lhrm" (OuterVolumeSpecName: "kube-api-access-6lhrm") pod "3ed89d84-fa38-42fa-82d6-54912e1f72d4" (UID: "3ed89d84-fa38-42fa-82d6-54912e1f72d4"). InnerVolumeSpecName "kube-api-access-6lhrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.842691 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lhrm\" (UniqueName: \"kubernetes.io/projected/3ed89d84-fa38-42fa-82d6-54912e1f72d4-kube-api-access-6lhrm\") on node \"crc\" DevicePath \"\"" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.883273 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ed89d84-fa38-42fa-82d6-54912e1f72d4" (UID: "3ed89d84-fa38-42fa-82d6-54912e1f72d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:39:17 crc kubenswrapper[4921]: I0312 14:39:17.944468 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed89d84-fa38-42fa-82d6-54912e1f72d4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.123657 4921 generic.go:334] "Generic (PLEG): container finished" podID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerID="e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8" exitCode=0 Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.123710 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerDied","Data":"e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8"} Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.123739 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcvtv" event={"ID":"3ed89d84-fa38-42fa-82d6-54912e1f72d4","Type":"ContainerDied","Data":"b3efdc2bd732c3cc8c2bf584e9fc59e82bd2c44cc38fc3aee3129eb0df609a87"} Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.123757 4921 scope.go:117] "RemoveContainer" containerID="e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.123898 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vcvtv" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.149141 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcvtv"] Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.152227 4921 scope.go:117] "RemoveContainer" containerID="412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.160078 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vcvtv"] Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.422581 4921 scope.go:117] "RemoveContainer" containerID="7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.491939 4921 scope.go:117] "RemoveContainer" containerID="e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8" Mar 12 14:39:18 crc kubenswrapper[4921]: E0312 14:39:18.492628 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8\": container with ID starting with e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8 not found: ID does not exist" containerID="e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.492665 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8"} err="failed to get container status \"e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8\": rpc error: code = NotFound desc = could not find container \"e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8\": container with ID starting with e207b93f75b75b847a3afe9294532298e64713db6614f0e38b5657700e822ca8 not found: ID does 
not exist" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.492692 4921 scope.go:117] "RemoveContainer" containerID="412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6" Mar 12 14:39:18 crc kubenswrapper[4921]: E0312 14:39:18.493123 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6\": container with ID starting with 412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6 not found: ID does not exist" containerID="412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.493152 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6"} err="failed to get container status \"412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6\": rpc error: code = NotFound desc = could not find container \"412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6\": container with ID starting with 412b04bf6c37a42a6dc103f563cf6d4a98953388890b434d02ad38ccc892b2c6 not found: ID does not exist" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.493170 4921 scope.go:117] "RemoveContainer" containerID="7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499" Mar 12 14:39:18 crc kubenswrapper[4921]: E0312 14:39:18.493404 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499\": container with ID starting with 7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499 not found: ID does not exist" containerID="7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499" Mar 12 14:39:18 crc kubenswrapper[4921]: I0312 14:39:18.493432 4921 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499"} err="failed to get container status \"7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499\": rpc error: code = NotFound desc = could not find container \"7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499\": container with ID starting with 7c9f902bc912a384167a1bf81ed2c247db7f33243fcd16a3a20622e44df92499 not found: ID does not exist" Mar 12 14:39:19 crc kubenswrapper[4921]: I0312 14:39:19.997949 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" path="/var/lib/kubelet/pods/3ed89d84-fa38-42fa-82d6-54912e1f72d4/volumes" Mar 12 14:39:29 crc kubenswrapper[4921]: I0312 14:39:29.989001 4921 scope.go:117] "RemoveContainer" containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:39:30 crc kubenswrapper[4921]: I0312 14:39:30.386798 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"d788b2bf8430364601978f85d83f69873cfb108662e13e91e5249e48c7f0a12b"} Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.142504 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555440-crc6g"] Mar 12 14:40:00 crc kubenswrapper[4921]: E0312 14:40:00.143525 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="registry-server" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.143542 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="registry-server" Mar 12 14:40:00 crc kubenswrapper[4921]: E0312 14:40:00.143552 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="extract-content" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.143560 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="extract-content" Mar 12 14:40:00 crc kubenswrapper[4921]: E0312 14:40:00.143606 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="extract-utilities" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.143615 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="extract-utilities" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.143860 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed89d84-fa38-42fa-82d6-54912e1f72d4" containerName="registry-server" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.144640 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.146854 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.147200 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.149135 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.154396 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555440-crc6g"] Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.253699 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58sr7\" (UniqueName: 
\"kubernetes.io/projected/44afce34-b62f-457b-a225-f8be11c8b20b-kube-api-access-58sr7\") pod \"auto-csr-approver-29555440-crc6g\" (UID: \"44afce34-b62f-457b-a225-f8be11c8b20b\") " pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.356476 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58sr7\" (UniqueName: \"kubernetes.io/projected/44afce34-b62f-457b-a225-f8be11c8b20b-kube-api-access-58sr7\") pod \"auto-csr-approver-29555440-crc6g\" (UID: \"44afce34-b62f-457b-a225-f8be11c8b20b\") " pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.378587 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58sr7\" (UniqueName: \"kubernetes.io/projected/44afce34-b62f-457b-a225-f8be11c8b20b-kube-api-access-58sr7\") pod \"auto-csr-approver-29555440-crc6g\" (UID: \"44afce34-b62f-457b-a225-f8be11c8b20b\") " pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.463176 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.936309 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555440-crc6g"] Mar 12 14:40:00 crc kubenswrapper[4921]: I0312 14:40:00.992418 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555440-crc6g" event={"ID":"44afce34-b62f-457b-a225-f8be11c8b20b","Type":"ContainerStarted","Data":"5c732d141f33e89cf1f4fcf8f0ee34af0c4fda6e4bebaa70d273d047f3e000cd"} Mar 12 14:40:04 crc kubenswrapper[4921]: I0312 14:40:04.021126 4921 generic.go:334] "Generic (PLEG): container finished" podID="44afce34-b62f-457b-a225-f8be11c8b20b" containerID="04aa8d94f68de59ff537c18cc1213af985c0e558e84cb1291be0fbc0a881d4a1" exitCode=0 Mar 12 14:40:04 crc kubenswrapper[4921]: I0312 14:40:04.021253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555440-crc6g" event={"ID":"44afce34-b62f-457b-a225-f8be11c8b20b","Type":"ContainerDied","Data":"04aa8d94f68de59ff537c18cc1213af985c0e558e84cb1291be0fbc0a881d4a1"} Mar 12 14:40:05 crc kubenswrapper[4921]: I0312 14:40:05.604925 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:05 crc kubenswrapper[4921]: I0312 14:40:05.685408 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58sr7\" (UniqueName: \"kubernetes.io/projected/44afce34-b62f-457b-a225-f8be11c8b20b-kube-api-access-58sr7\") pod \"44afce34-b62f-457b-a225-f8be11c8b20b\" (UID: \"44afce34-b62f-457b-a225-f8be11c8b20b\") " Mar 12 14:40:05 crc kubenswrapper[4921]: I0312 14:40:05.691975 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44afce34-b62f-457b-a225-f8be11c8b20b-kube-api-access-58sr7" (OuterVolumeSpecName: "kube-api-access-58sr7") pod "44afce34-b62f-457b-a225-f8be11c8b20b" (UID: "44afce34-b62f-457b-a225-f8be11c8b20b"). InnerVolumeSpecName "kube-api-access-58sr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:40:05 crc kubenswrapper[4921]: I0312 14:40:05.787882 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58sr7\" (UniqueName: \"kubernetes.io/projected/44afce34-b62f-457b-a225-f8be11c8b20b-kube-api-access-58sr7\") on node \"crc\" DevicePath \"\"" Mar 12 14:40:06 crc kubenswrapper[4921]: I0312 14:40:06.039523 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555440-crc6g" event={"ID":"44afce34-b62f-457b-a225-f8be11c8b20b","Type":"ContainerDied","Data":"5c732d141f33e89cf1f4fcf8f0ee34af0c4fda6e4bebaa70d273d047f3e000cd"} Mar 12 14:40:06 crc kubenswrapper[4921]: I0312 14:40:06.039585 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c732d141f33e89cf1f4fcf8f0ee34af0c4fda6e4bebaa70d273d047f3e000cd" Mar 12 14:40:06 crc kubenswrapper[4921]: I0312 14:40:06.039599 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555440-crc6g" Mar 12 14:40:06 crc kubenswrapper[4921]: I0312 14:40:06.678502 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555434-tv7kn"] Mar 12 14:40:06 crc kubenswrapper[4921]: I0312 14:40:06.693028 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555434-tv7kn"] Mar 12 14:40:07 crc kubenswrapper[4921]: I0312 14:40:07.994357 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7" path="/var/lib/kubelet/pods/fec5e8bb-9e1e-47bc-9275-a22cc1c7f4c7/volumes" Mar 12 14:40:48 crc kubenswrapper[4921]: I0312 14:40:48.171766 4921 scope.go:117] "RemoveContainer" containerID="0b1f50333fe75d1be69571abf0c9d5d76f8705f468340cc7dde63fb3f9705965" Mar 12 14:41:56 crc kubenswrapper[4921]: I0312 14:41:56.324392 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:41:56 crc kubenswrapper[4921]: I0312 14:41:56.324945 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.147395 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555442-6brsm"] Mar 12 14:42:00 crc kubenswrapper[4921]: E0312 14:42:00.148391 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44afce34-b62f-457b-a225-f8be11c8b20b" containerName="oc" Mar 12 14:42:00 crc 
kubenswrapper[4921]: I0312 14:42:00.148409 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="44afce34-b62f-457b-a225-f8be11c8b20b" containerName="oc" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.148658 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="44afce34-b62f-457b-a225-f8be11c8b20b" containerName="oc" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.149460 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.152247 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.152411 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.152516 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.167511 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrrr\" (UniqueName: \"kubernetes.io/projected/2756ba22-178c-45ec-b7ac-4661efa5b786-kube-api-access-zkrrr\") pod \"auto-csr-approver-29555442-6brsm\" (UID: \"2756ba22-178c-45ec-b7ac-4661efa5b786\") " pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.175998 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555442-6brsm"] Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.269525 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrrr\" (UniqueName: \"kubernetes.io/projected/2756ba22-178c-45ec-b7ac-4661efa5b786-kube-api-access-zkrrr\") pod \"auto-csr-approver-29555442-6brsm\" 
(UID: \"2756ba22-178c-45ec-b7ac-4661efa5b786\") " pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.296678 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrrr\" (UniqueName: \"kubernetes.io/projected/2756ba22-178c-45ec-b7ac-4661efa5b786-kube-api-access-zkrrr\") pod \"auto-csr-approver-29555442-6brsm\" (UID: \"2756ba22-178c-45ec-b7ac-4661efa5b786\") " pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.471279 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:00 crc kubenswrapper[4921]: I0312 14:42:00.948670 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555442-6brsm"] Mar 12 14:42:01 crc kubenswrapper[4921]: I0312 14:42:01.061459 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555442-6brsm" event={"ID":"2756ba22-178c-45ec-b7ac-4661efa5b786","Type":"ContainerStarted","Data":"d1edb158739019945dddbd2a5877b9d69515a15acfc084e1ab6839b033d82b0c"} Mar 12 14:42:03 crc kubenswrapper[4921]: I0312 14:42:03.080893 4921 generic.go:334] "Generic (PLEG): container finished" podID="2756ba22-178c-45ec-b7ac-4661efa5b786" containerID="54dfcbe34888661c76bb40160619ffeb6f5162afadc76849206237e1e0714a76" exitCode=0 Mar 12 14:42:03 crc kubenswrapper[4921]: I0312 14:42:03.080992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555442-6brsm" event={"ID":"2756ba22-178c-45ec-b7ac-4661efa5b786","Type":"ContainerDied","Data":"54dfcbe34888661c76bb40160619ffeb6f5162afadc76849206237e1e0714a76"} Mar 12 14:42:04 crc kubenswrapper[4921]: I0312 14:42:04.594776 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:04 crc kubenswrapper[4921]: I0312 14:42:04.765499 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkrrr\" (UniqueName: \"kubernetes.io/projected/2756ba22-178c-45ec-b7ac-4661efa5b786-kube-api-access-zkrrr\") pod \"2756ba22-178c-45ec-b7ac-4661efa5b786\" (UID: \"2756ba22-178c-45ec-b7ac-4661efa5b786\") " Mar 12 14:42:04 crc kubenswrapper[4921]: I0312 14:42:04.771275 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2756ba22-178c-45ec-b7ac-4661efa5b786-kube-api-access-zkrrr" (OuterVolumeSpecName: "kube-api-access-zkrrr") pod "2756ba22-178c-45ec-b7ac-4661efa5b786" (UID: "2756ba22-178c-45ec-b7ac-4661efa5b786"). InnerVolumeSpecName "kube-api-access-zkrrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:42:04 crc kubenswrapper[4921]: I0312 14:42:04.868731 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkrrr\" (UniqueName: \"kubernetes.io/projected/2756ba22-178c-45ec-b7ac-4661efa5b786-kube-api-access-zkrrr\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:05 crc kubenswrapper[4921]: I0312 14:42:05.102227 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555442-6brsm" event={"ID":"2756ba22-178c-45ec-b7ac-4661efa5b786","Type":"ContainerDied","Data":"d1edb158739019945dddbd2a5877b9d69515a15acfc084e1ab6839b033d82b0c"} Mar 12 14:42:05 crc kubenswrapper[4921]: I0312 14:42:05.102268 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1edb158739019945dddbd2a5877b9d69515a15acfc084e1ab6839b033d82b0c" Mar 12 14:42:05 crc kubenswrapper[4921]: I0312 14:42:05.102323 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555442-6brsm" Mar 12 14:42:05 crc kubenswrapper[4921]: I0312 14:42:05.695047 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555436-zv8kj"] Mar 12 14:42:05 crc kubenswrapper[4921]: I0312 14:42:05.706765 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555436-zv8kj"] Mar 12 14:42:05 crc kubenswrapper[4921]: I0312 14:42:05.994979 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd4fe7e-4531-412b-a9f3-03e79a87d30d" path="/var/lib/kubelet/pods/0cd4fe7e-4531-412b-a9f3-03e79a87d30d/volumes" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.331439 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8h45z"] Mar 12 14:42:12 crc kubenswrapper[4921]: E0312 14:42:12.332985 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2756ba22-178c-45ec-b7ac-4661efa5b786" containerName="oc" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.333029 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2756ba22-178c-45ec-b7ac-4661efa5b786" containerName="oc" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.333405 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2756ba22-178c-45ec-b7ac-4661efa5b786" containerName="oc" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.337331 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.356899 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8h45z"] Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.427120 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-catalog-content\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.427490 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8zjp\" (UniqueName: \"kubernetes.io/projected/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-kube-api-access-b8zjp\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.427544 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-utilities\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.529571 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-catalog-content\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.529633 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b8zjp\" (UniqueName: \"kubernetes.io/projected/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-kube-api-access-b8zjp\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.529665 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-utilities\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.530143 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-catalog-content\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.530156 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-utilities\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.554052 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8zjp\" (UniqueName: \"kubernetes.io/projected/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-kube-api-access-b8zjp\") pod \"community-operators-8h45z\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:12 crc kubenswrapper[4921]: I0312 14:42:12.665846 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:13 crc kubenswrapper[4921]: I0312 14:42:13.183354 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8h45z"] Mar 12 14:42:14 crc kubenswrapper[4921]: I0312 14:42:14.179376 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerID="eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a" exitCode=0 Mar 12 14:42:14 crc kubenswrapper[4921]: I0312 14:42:14.179650 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8h45z" event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerDied","Data":"eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a"} Mar 12 14:42:14 crc kubenswrapper[4921]: I0312 14:42:14.179679 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8h45z" event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerStarted","Data":"ade1bfdc7aa111334df5db2f67b817ae758c35181f0ab6487cafecb7447ca6f4"} Mar 12 14:42:17 crc kubenswrapper[4921]: I0312 14:42:17.207604 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8h45z" event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerStarted","Data":"cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979"} Mar 12 14:42:18 crc kubenswrapper[4921]: I0312 14:42:18.220107 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerID="cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979" exitCode=0 Mar 12 14:42:18 crc kubenswrapper[4921]: I0312 14:42:18.220260 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8h45z" 
event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerDied","Data":"cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979"} Mar 12 14:42:19 crc kubenswrapper[4921]: I0312 14:42:19.236995 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8h45z" event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerStarted","Data":"408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66"} Mar 12 14:42:19 crc kubenswrapper[4921]: I0312 14:42:19.263419 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8h45z" podStartSLOduration=2.726815879 podStartE2EDuration="7.263394456s" podCreationTimestamp="2026-03-12 14:42:12 +0000 UTC" firstStartedPulling="2026-03-12 14:42:14.182203227 +0000 UTC m=+5556.872275318" lastFinishedPulling="2026-03-12 14:42:18.718781924 +0000 UTC m=+5561.408853895" observedRunningTime="2026-03-12 14:42:19.254010386 +0000 UTC m=+5561.944082357" watchObservedRunningTime="2026-03-12 14:42:19.263394456 +0000 UTC m=+5561.953466427" Mar 12 14:42:22 crc kubenswrapper[4921]: I0312 14:42:22.666391 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:23 crc kubenswrapper[4921]: I0312 14:42:23.330319 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:23 crc kubenswrapper[4921]: I0312 14:42:23.406214 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:24 crc kubenswrapper[4921]: I0312 14:42:24.334367 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:24 crc kubenswrapper[4921]: I0312 14:42:24.901532 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8h45z"] Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.296116 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8h45z" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="registry-server" containerID="cri-o://408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66" gracePeriod=2 Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.324834 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.324908 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.889274 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.972196 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-catalog-content\") pod \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.972286 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8zjp\" (UniqueName: \"kubernetes.io/projected/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-kube-api-access-b8zjp\") pod \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.972393 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-utilities\") pod \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\" (UID: \"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4\") " Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.973586 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-utilities" (OuterVolumeSpecName: "utilities") pod "1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" (UID: "1ed882b2-108d-4733-bb0b-0d7a7f9bcba4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:42:26 crc kubenswrapper[4921]: I0312 14:42:26.992115 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-kube-api-access-b8zjp" (OuterVolumeSpecName: "kube-api-access-b8zjp") pod "1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" (UID: "1ed882b2-108d-4733-bb0b-0d7a7f9bcba4"). InnerVolumeSpecName "kube-api-access-b8zjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.028692 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" (UID: "1ed882b2-108d-4733-bb0b-0d7a7f9bcba4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.075121 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.075158 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8zjp\" (UniqueName: \"kubernetes.io/projected/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-kube-api-access-b8zjp\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.075171 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.308627 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerID="408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66" exitCode=0 Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.308666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8h45z" event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerDied","Data":"408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66"} Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.308698 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-8h45z" event={"ID":"1ed882b2-108d-4733-bb0b-0d7a7f9bcba4","Type":"ContainerDied","Data":"ade1bfdc7aa111334df5db2f67b817ae758c35181f0ab6487cafecb7447ca6f4"} Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.308721 4921 scope.go:117] "RemoveContainer" containerID="408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.308879 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8h45z" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.347653 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8h45z"] Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.367609 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8h45z"] Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.391247 4921 scope.go:117] "RemoveContainer" containerID="cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.422995 4921 scope.go:117] "RemoveContainer" containerID="eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.465536 4921 scope.go:117] "RemoveContainer" containerID="408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66" Mar 12 14:42:27 crc kubenswrapper[4921]: E0312 14:42:27.465947 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66\": container with ID starting with 408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66 not found: ID does not exist" containerID="408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 
14:42:27.465987 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66"} err="failed to get container status \"408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66\": rpc error: code = NotFound desc = could not find container \"408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66\": container with ID starting with 408a2f50ae271cae0044a21827a53bb725bbad329cd8a84dcb47709e41a4cc66 not found: ID does not exist" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.466017 4921 scope.go:117] "RemoveContainer" containerID="cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979" Mar 12 14:42:27 crc kubenswrapper[4921]: E0312 14:42:27.466382 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979\": container with ID starting with cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979 not found: ID does not exist" containerID="cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.466424 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979"} err="failed to get container status \"cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979\": rpc error: code = NotFound desc = could not find container \"cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979\": container with ID starting with cb02ab526b85224d221e9f101dc3a3c68b1800cc1913752d99ac51ffa98d9979 not found: ID does not exist" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.466453 4921 scope.go:117] "RemoveContainer" containerID="eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a" Mar 12 14:42:27 crc 
kubenswrapper[4921]: E0312 14:42:27.466866 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a\": container with ID starting with eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a not found: ID does not exist" containerID="eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.466895 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a"} err="failed to get container status \"eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a\": rpc error: code = NotFound desc = could not find container \"eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a\": container with ID starting with eba8560d6d07a7ee1ca5c1406cfea778b7da31c27124c7da9546bda18eee973a not found: ID does not exist" Mar 12 14:42:27 crc kubenswrapper[4921]: I0312 14:42:27.994649 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" path="/var/lib/kubelet/pods/1ed882b2-108d-4733-bb0b-0d7a7f9bcba4/volumes" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.099773 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bddgg"] Mar 12 14:42:31 crc kubenswrapper[4921]: E0312 14:42:31.100795 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="extract-content" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.100829 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="extract-content" Mar 12 14:42:31 crc kubenswrapper[4921]: E0312 14:42:31.100852 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="extract-utilities" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.100858 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="extract-utilities" Mar 12 14:42:31 crc kubenswrapper[4921]: E0312 14:42:31.100871 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="registry-server" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.100878 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="registry-server" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.101049 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed882b2-108d-4733-bb0b-0d7a7f9bcba4" containerName="registry-server" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.102636 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.121477 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddgg"] Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.169976 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kgln\" (UniqueName: \"kubernetes.io/projected/d5583e26-4f9d-49b8-b833-aaeb587cef10-kube-api-access-6kgln\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.170057 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-catalog-content\") pod \"redhat-marketplace-bddgg\" (UID: 
\"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.170087 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-utilities\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.271969 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-catalog-content\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.272020 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-utilities\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.272204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kgln\" (UniqueName: \"kubernetes.io/projected/d5583e26-4f9d-49b8-b833-aaeb587cef10-kube-api-access-6kgln\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.272990 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-catalog-content\") pod \"redhat-marketplace-bddgg\" (UID: 
\"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.273269 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-utilities\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.297822 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kgln\" (UniqueName: \"kubernetes.io/projected/d5583e26-4f9d-49b8-b833-aaeb587cef10-kube-api-access-6kgln\") pod \"redhat-marketplace-bddgg\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.423688 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:31 crc kubenswrapper[4921]: I0312 14:42:31.929418 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddgg"] Mar 12 14:42:32 crc kubenswrapper[4921]: I0312 14:42:32.369052 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerID="71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f" exitCode=0 Mar 12 14:42:32 crc kubenswrapper[4921]: I0312 14:42:32.369169 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddgg" event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerDied","Data":"71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f"} Mar 12 14:42:32 crc kubenswrapper[4921]: I0312 14:42:32.369347 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddgg" 
event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerStarted","Data":"b4ad42ea6cdae529ddcba19ba82a237282d260129441e5b4d9a58b159b9c28be"} Mar 12 14:42:33 crc kubenswrapper[4921]: I0312 14:42:33.382602 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddgg" event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerStarted","Data":"824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a"} Mar 12 14:42:34 crc kubenswrapper[4921]: I0312 14:42:34.398843 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerID="824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a" exitCode=0 Mar 12 14:42:34 crc kubenswrapper[4921]: I0312 14:42:34.398946 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddgg" event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerDied","Data":"824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a"} Mar 12 14:42:35 crc kubenswrapper[4921]: I0312 14:42:35.413922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddgg" event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerStarted","Data":"71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19"} Mar 12 14:42:35 crc kubenswrapper[4921]: I0312 14:42:35.441016 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bddgg" podStartSLOduration=1.905126189 podStartE2EDuration="4.440992582s" podCreationTimestamp="2026-03-12 14:42:31 +0000 UTC" firstStartedPulling="2026-03-12 14:42:32.372180258 +0000 UTC m=+5575.062252229" lastFinishedPulling="2026-03-12 14:42:34.908046651 +0000 UTC m=+5577.598118622" observedRunningTime="2026-03-12 14:42:35.437452693 +0000 UTC m=+5578.127524664" watchObservedRunningTime="2026-03-12 14:42:35.440992582 +0000 UTC 
m=+5578.131064553" Mar 12 14:42:41 crc kubenswrapper[4921]: I0312 14:42:41.423856 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:41 crc kubenswrapper[4921]: I0312 14:42:41.424472 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:41 crc kubenswrapper[4921]: I0312 14:42:41.506561 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:41 crc kubenswrapper[4921]: I0312 14:42:41.581441 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:42 crc kubenswrapper[4921]: I0312 14:42:42.288786 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddgg"] Mar 12 14:42:43 crc kubenswrapper[4921]: I0312 14:42:43.492070 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bddgg" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="registry-server" containerID="cri-o://71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19" gracePeriod=2 Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.127698 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.178294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kgln\" (UniqueName: \"kubernetes.io/projected/d5583e26-4f9d-49b8-b833-aaeb587cef10-kube-api-access-6kgln\") pod \"d5583e26-4f9d-49b8-b833-aaeb587cef10\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.178438 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-utilities\") pod \"d5583e26-4f9d-49b8-b833-aaeb587cef10\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.178611 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-catalog-content\") pod \"d5583e26-4f9d-49b8-b833-aaeb587cef10\" (UID: \"d5583e26-4f9d-49b8-b833-aaeb587cef10\") " Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.179693 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-utilities" (OuterVolumeSpecName: "utilities") pod "d5583e26-4f9d-49b8-b833-aaeb587cef10" (UID: "d5583e26-4f9d-49b8-b833-aaeb587cef10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.185646 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5583e26-4f9d-49b8-b833-aaeb587cef10-kube-api-access-6kgln" (OuterVolumeSpecName: "kube-api-access-6kgln") pod "d5583e26-4f9d-49b8-b833-aaeb587cef10" (UID: "d5583e26-4f9d-49b8-b833-aaeb587cef10"). InnerVolumeSpecName "kube-api-access-6kgln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.207123 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5583e26-4f9d-49b8-b833-aaeb587cef10" (UID: "d5583e26-4f9d-49b8-b833-aaeb587cef10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.280877 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kgln\" (UniqueName: \"kubernetes.io/projected/d5583e26-4f9d-49b8-b833-aaeb587cef10-kube-api-access-6kgln\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.281121 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.281181 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5583e26-4f9d-49b8-b833-aaeb587cef10-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.502387 4921 generic.go:334] "Generic (PLEG): container finished" podID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerID="71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19" exitCode=0 Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.502430 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddgg" event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerDied","Data":"71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19"} Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.502458 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bddgg" event={"ID":"d5583e26-4f9d-49b8-b833-aaeb587cef10","Type":"ContainerDied","Data":"b4ad42ea6cdae529ddcba19ba82a237282d260129441e5b4d9a58b159b9c28be"} Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.502469 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddgg" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.502479 4921 scope.go:117] "RemoveContainer" containerID="71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.529530 4921 scope.go:117] "RemoveContainer" containerID="824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.544807 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddgg"] Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.556288 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddgg"] Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.564093 4921 scope.go:117] "RemoveContainer" containerID="71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.597593 4921 scope.go:117] "RemoveContainer" containerID="71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19" Mar 12 14:42:44 crc kubenswrapper[4921]: E0312 14:42:44.598047 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19\": container with ID starting with 71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19 not found: ID does not exist" containerID="71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.598082 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19"} err="failed to get container status \"71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19\": rpc error: code = NotFound desc = could not find container \"71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19\": container with ID starting with 71af4de8ffe5a5a8db38672355f954fd69554b0325cebc62b082a14e16ebab19 not found: ID does not exist" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.598106 4921 scope.go:117] "RemoveContainer" containerID="824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a" Mar 12 14:42:44 crc kubenswrapper[4921]: E0312 14:42:44.598325 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a\": container with ID starting with 824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a not found: ID does not exist" containerID="824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.598358 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a"} err="failed to get container status \"824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a\": rpc error: code = NotFound desc = could not find container \"824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a\": container with ID starting with 824adf21ccef0e75cc1e83865218d4fcc312fb0ecdf7af529feb8172622c6d3a not found: ID does not exist" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.598378 4921 scope.go:117] "RemoveContainer" containerID="71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f" Mar 12 14:42:44 crc kubenswrapper[4921]: E0312 
14:42:44.598658 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f\": container with ID starting with 71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f not found: ID does not exist" containerID="71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f" Mar 12 14:42:44 crc kubenswrapper[4921]: I0312 14:42:44.598688 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f"} err="failed to get container status \"71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f\": rpc error: code = NotFound desc = could not find container \"71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f\": container with ID starting with 71ceed042bb0719f17cf52435305d1fe3442bb9bb81f4905fb2d856c65b89c6f not found: ID does not exist" Mar 12 14:42:45 crc kubenswrapper[4921]: I0312 14:42:45.993319 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" path="/var/lib/kubelet/pods/d5583e26-4f9d-49b8-b833-aaeb587cef10/volumes" Mar 12 14:42:48 crc kubenswrapper[4921]: I0312 14:42:48.282424 4921 scope.go:117] "RemoveContainer" containerID="d6e68c525465f58a87589a60ffc38fe19df203fef38a67c218cdbb778d8335aa" Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.323975 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.324508 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.324551 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.325347 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d788b2bf8430364601978f85d83f69873cfb108662e13e91e5249e48c7f0a12b"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.325401 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://d788b2bf8430364601978f85d83f69873cfb108662e13e91e5249e48c7f0a12b" gracePeriod=600 Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.628390 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="d788b2bf8430364601978f85d83f69873cfb108662e13e91e5249e48c7f0a12b" exitCode=0 Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.628481 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"d788b2bf8430364601978f85d83f69873cfb108662e13e91e5249e48c7f0a12b"} Mar 12 14:42:56 crc kubenswrapper[4921]: I0312 14:42:56.628719 4921 scope.go:117] "RemoveContainer" 
containerID="8a22089889f69563b476a15eddd91145776a84f8dbac82eaefec93c125a04ea6" Mar 12 14:42:57 crc kubenswrapper[4921]: I0312 14:42:57.641632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01"} Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.151948 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555444-jhrjr"] Mar 12 14:44:00 crc kubenswrapper[4921]: E0312 14:44:00.152811 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="extract-content" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.153402 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="extract-content" Mar 12 14:44:00 crc kubenswrapper[4921]: E0312 14:44:00.153509 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="registry-server" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.153515 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="registry-server" Mar 12 14:44:00 crc kubenswrapper[4921]: E0312 14:44:00.153528 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="extract-utilities" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.153534 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="extract-utilities" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.153723 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5583e26-4f9d-49b8-b833-aaeb587cef10" containerName="registry-server" Mar 12 
14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.154373 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.156615 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.156886 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.159089 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.169045 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555444-jhrjr"] Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.250309 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4l2\" (UniqueName: \"kubernetes.io/projected/7c40cbff-69ba-4640-ac73-497e0d633cee-kube-api-access-rz4l2\") pod \"auto-csr-approver-29555444-jhrjr\" (UID: \"7c40cbff-69ba-4640-ac73-497e0d633cee\") " pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.352472 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4l2\" (UniqueName: \"kubernetes.io/projected/7c40cbff-69ba-4640-ac73-497e0d633cee-kube-api-access-rz4l2\") pod \"auto-csr-approver-29555444-jhrjr\" (UID: \"7c40cbff-69ba-4640-ac73-497e0d633cee\") " pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.384584 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4l2\" (UniqueName: 
\"kubernetes.io/projected/7c40cbff-69ba-4640-ac73-497e0d633cee-kube-api-access-rz4l2\") pod \"auto-csr-approver-29555444-jhrjr\" (UID: \"7c40cbff-69ba-4640-ac73-497e0d633cee\") " pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.479222 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:00 crc kubenswrapper[4921]: I0312 14:44:00.922030 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555444-jhrjr"] Mar 12 14:44:01 crc kubenswrapper[4921]: I0312 14:44:01.297890 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" event={"ID":"7c40cbff-69ba-4640-ac73-497e0d633cee","Type":"ContainerStarted","Data":"aac2d8c174b5e960fb43c2995bb64b62e9196b6428cce52cfe27676796eb68da"} Mar 12 14:44:03 crc kubenswrapper[4921]: I0312 14:44:03.314391 4921 generic.go:334] "Generic (PLEG): container finished" podID="7c40cbff-69ba-4640-ac73-497e0d633cee" containerID="02d900a501938721b11ff641ab6214e62635421ace13d98fea7e7c9a64b09883" exitCode=0 Mar 12 14:44:03 crc kubenswrapper[4921]: I0312 14:44:03.314505 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" event={"ID":"7c40cbff-69ba-4640-ac73-497e0d633cee","Type":"ContainerDied","Data":"02d900a501938721b11ff641ab6214e62635421ace13d98fea7e7c9a64b09883"} Mar 12 14:44:04 crc kubenswrapper[4921]: I0312 14:44:04.842544 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.036600 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz4l2\" (UniqueName: \"kubernetes.io/projected/7c40cbff-69ba-4640-ac73-497e0d633cee-kube-api-access-rz4l2\") pod \"7c40cbff-69ba-4640-ac73-497e0d633cee\" (UID: \"7c40cbff-69ba-4640-ac73-497e0d633cee\") " Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.046237 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c40cbff-69ba-4640-ac73-497e0d633cee-kube-api-access-rz4l2" (OuterVolumeSpecName: "kube-api-access-rz4l2") pod "7c40cbff-69ba-4640-ac73-497e0d633cee" (UID: "7c40cbff-69ba-4640-ac73-497e0d633cee"). InnerVolumeSpecName "kube-api-access-rz4l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.139262 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz4l2\" (UniqueName: \"kubernetes.io/projected/7c40cbff-69ba-4640-ac73-497e0d633cee-kube-api-access-rz4l2\") on node \"crc\" DevicePath \"\"" Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.339219 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" event={"ID":"7c40cbff-69ba-4640-ac73-497e0d633cee","Type":"ContainerDied","Data":"aac2d8c174b5e960fb43c2995bb64b62e9196b6428cce52cfe27676796eb68da"} Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.339289 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555444-jhrjr" Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.339504 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac2d8c174b5e960fb43c2995bb64b62e9196b6428cce52cfe27676796eb68da" Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.928846 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555438-4cqxb"] Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.944373 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555438-4cqxb"] Mar 12 14:44:05 crc kubenswrapper[4921]: I0312 14:44:05.998138 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c865d942-d9a9-4a66-8b9e-b19c82064aeb" path="/var/lib/kubelet/pods/c865d942-d9a9-4a66-8b9e-b19c82064aeb/volumes" Mar 12 14:44:48 crc kubenswrapper[4921]: I0312 14:44:48.419399 4921 scope.go:117] "RemoveContainer" containerID="9576560a889bf00c9d3fac5b181891be9e41b593c5e589d9923ab7ec6d4f81f8" Mar 12 14:44:56 crc kubenswrapper[4921]: I0312 14:44:56.323398 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:44:56 crc kubenswrapper[4921]: I0312 14:44:56.323968 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.150357 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr"] Mar 12 14:45:00 crc kubenswrapper[4921]: E0312 14:45:00.151252 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c40cbff-69ba-4640-ac73-497e0d633cee" containerName="oc" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.151266 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c40cbff-69ba-4640-ac73-497e0d633cee" containerName="oc" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.151500 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c40cbff-69ba-4640-ac73-497e0d633cee" containerName="oc" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.152141 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.157307 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.157737 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.165530 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr"] Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.306132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8fc\" (UniqueName: \"kubernetes.io/projected/80794446-210c-4c0c-ae85-fdbd7565ba54-kube-api-access-lg8fc\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.306212 4921 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80794446-210c-4c0c-ae85-fdbd7565ba54-secret-volume\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.306245 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80794446-210c-4c0c-ae85-fdbd7565ba54-config-volume\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.408675 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8fc\" (UniqueName: \"kubernetes.io/projected/80794446-210c-4c0c-ae85-fdbd7565ba54-kube-api-access-lg8fc\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.408741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80794446-210c-4c0c-ae85-fdbd7565ba54-secret-volume\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.408769 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80794446-210c-4c0c-ae85-fdbd7565ba54-config-volume\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.409755 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80794446-210c-4c0c-ae85-fdbd7565ba54-config-volume\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.416123 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80794446-210c-4c0c-ae85-fdbd7565ba54-secret-volume\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.427856 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8fc\" (UniqueName: \"kubernetes.io/projected/80794446-210c-4c0c-ae85-fdbd7565ba54-kube-api-access-lg8fc\") pod \"collect-profiles-29555445-xtjfr\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.477326 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:00 crc kubenswrapper[4921]: I0312 14:45:00.974919 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr"] Mar 12 14:45:01 crc kubenswrapper[4921]: I0312 14:45:01.846735 4921 generic.go:334] "Generic (PLEG): container finished" podID="80794446-210c-4c0c-ae85-fdbd7565ba54" containerID="6adc46e47163256d4b6935553ca5971c6204a6f4c4458de1de5fd1470c4d8efe" exitCode=0 Mar 12 14:45:01 crc kubenswrapper[4921]: I0312 14:45:01.846966 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" event={"ID":"80794446-210c-4c0c-ae85-fdbd7565ba54","Type":"ContainerDied","Data":"6adc46e47163256d4b6935553ca5971c6204a6f4c4458de1de5fd1470c4d8efe"} Mar 12 14:45:01 crc kubenswrapper[4921]: I0312 14:45:01.847142 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" event={"ID":"80794446-210c-4c0c-ae85-fdbd7565ba54","Type":"ContainerStarted","Data":"9f5a10b4b71acc9c0d3fb5d5e9ba883d3e61495f3028a06b45fc56b496d44f7b"} Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.341330 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.412625 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80794446-210c-4c0c-ae85-fdbd7565ba54-secret-volume\") pod \"80794446-210c-4c0c-ae85-fdbd7565ba54\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.412714 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80794446-210c-4c0c-ae85-fdbd7565ba54-config-volume\") pod \"80794446-210c-4c0c-ae85-fdbd7565ba54\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.412928 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg8fc\" (UniqueName: \"kubernetes.io/projected/80794446-210c-4c0c-ae85-fdbd7565ba54-kube-api-access-lg8fc\") pod \"80794446-210c-4c0c-ae85-fdbd7565ba54\" (UID: \"80794446-210c-4c0c-ae85-fdbd7565ba54\") " Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.414363 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80794446-210c-4c0c-ae85-fdbd7565ba54-config-volume" (OuterVolumeSpecName: "config-volume") pod "80794446-210c-4c0c-ae85-fdbd7565ba54" (UID: "80794446-210c-4c0c-ae85-fdbd7565ba54"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.423038 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80794446-210c-4c0c-ae85-fdbd7565ba54-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "80794446-210c-4c0c-ae85-fdbd7565ba54" (UID: "80794446-210c-4c0c-ae85-fdbd7565ba54"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.423067 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80794446-210c-4c0c-ae85-fdbd7565ba54-kube-api-access-lg8fc" (OuterVolumeSpecName: "kube-api-access-lg8fc") pod "80794446-210c-4c0c-ae85-fdbd7565ba54" (UID: "80794446-210c-4c0c-ae85-fdbd7565ba54"). InnerVolumeSpecName "kube-api-access-lg8fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.515595 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80794446-210c-4c0c-ae85-fdbd7565ba54-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.515625 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80794446-210c-4c0c-ae85-fdbd7565ba54-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.515635 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg8fc\" (UniqueName: \"kubernetes.io/projected/80794446-210c-4c0c-ae85-fdbd7565ba54-kube-api-access-lg8fc\") on node \"crc\" DevicePath \"\"" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.868599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" event={"ID":"80794446-210c-4c0c-ae85-fdbd7565ba54","Type":"ContainerDied","Data":"9f5a10b4b71acc9c0d3fb5d5e9ba883d3e61495f3028a06b45fc56b496d44f7b"} Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.868654 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f5a10b4b71acc9c0d3fb5d5e9ba883d3e61495f3028a06b45fc56b496d44f7b" Mar 12 14:45:03 crc kubenswrapper[4921]: I0312 14:45:03.868720 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr" Mar 12 14:45:04 crc kubenswrapper[4921]: I0312 14:45:04.430007 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n"] Mar 12 14:45:04 crc kubenswrapper[4921]: I0312 14:45:04.439313 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555400-mjp2n"] Mar 12 14:45:05 crc kubenswrapper[4921]: I0312 14:45:05.994838 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12955294-d435-42e0-9130-5a84882f0fe0" path="/var/lib/kubelet/pods/12955294-d435-42e0-9130-5a84882f0fe0/volumes" Mar 12 14:45:26 crc kubenswrapper[4921]: I0312 14:45:26.324394 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:45:26 crc kubenswrapper[4921]: I0312 14:45:26.324974 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:45:48 crc kubenswrapper[4921]: I0312 14:45:48.506691 4921 scope.go:117] "RemoveContainer" containerID="f0feebcb84a08e5b580c6e8db79d6242c3e29baca96ede264340ca207c64072f" Mar 12 14:45:56 crc kubenswrapper[4921]: I0312 14:45:56.324443 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 12 14:45:56 crc kubenswrapper[4921]: I0312 14:45:56.325001 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:45:56 crc kubenswrapper[4921]: I0312 14:45:56.325052 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:45:56 crc kubenswrapper[4921]: I0312 14:45:56.325783 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:45:56 crc kubenswrapper[4921]: I0312 14:45:56.325859 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" gracePeriod=600 Mar 12 14:45:56 crc kubenswrapper[4921]: E0312 14:45:56.456941 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:45:57 crc kubenswrapper[4921]: 
I0312 14:45:57.349455 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" exitCode=0 Mar 12 14:45:57 crc kubenswrapper[4921]: I0312 14:45:57.349523 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01"} Mar 12 14:45:57 crc kubenswrapper[4921]: I0312 14:45:57.349837 4921 scope.go:117] "RemoveContainer" containerID="d788b2bf8430364601978f85d83f69873cfb108662e13e91e5249e48c7f0a12b" Mar 12 14:45:57 crc kubenswrapper[4921]: I0312 14:45:57.356065 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:45:57 crc kubenswrapper[4921]: E0312 14:45:57.357026 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.143549 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555446-wvxn9"] Mar 12 14:46:00 crc kubenswrapper[4921]: E0312 14:46:00.144474 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80794446-210c-4c0c-ae85-fdbd7565ba54" containerName="collect-profiles" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.144490 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="80794446-210c-4c0c-ae85-fdbd7565ba54" containerName="collect-profiles" Mar 12 14:46:00 crc 
kubenswrapper[4921]: I0312 14:46:00.144710 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="80794446-210c-4c0c-ae85-fdbd7565ba54" containerName="collect-profiles" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.145529 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.148753 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.148867 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.149081 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.155074 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555446-wvxn9"] Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.190283 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcsz\" (UniqueName: \"kubernetes.io/projected/cd4c3bd5-1c00-4331-acf3-8af324f7258d-kube-api-access-htcsz\") pod \"auto-csr-approver-29555446-wvxn9\" (UID: \"cd4c3bd5-1c00-4331-acf3-8af324f7258d\") " pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.292228 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htcsz\" (UniqueName: \"kubernetes.io/projected/cd4c3bd5-1c00-4331-acf3-8af324f7258d-kube-api-access-htcsz\") pod \"auto-csr-approver-29555446-wvxn9\" (UID: \"cd4c3bd5-1c00-4331-acf3-8af324f7258d\") " pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.312450 
4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcsz\" (UniqueName: \"kubernetes.io/projected/cd4c3bd5-1c00-4331-acf3-8af324f7258d-kube-api-access-htcsz\") pod \"auto-csr-approver-29555446-wvxn9\" (UID: \"cd4c3bd5-1c00-4331-acf3-8af324f7258d\") " pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.466396 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.902259 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555446-wvxn9"] Mar 12 14:46:00 crc kubenswrapper[4921]: I0312 14:46:00.911677 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:46:01 crc kubenswrapper[4921]: I0312 14:46:01.389777 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" event={"ID":"cd4c3bd5-1c00-4331-acf3-8af324f7258d","Type":"ContainerStarted","Data":"63b308c74a1cb7962d4e1c7b4fa0ea852ca09adf2733115e47bb3ba916137d1d"} Mar 12 14:46:02 crc kubenswrapper[4921]: I0312 14:46:02.398628 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" event={"ID":"cd4c3bd5-1c00-4331-acf3-8af324f7258d","Type":"ContainerStarted","Data":"5a7d2ac7a8ab4ffe63febb50413d8f4e6139a3794c817c5130dc124b90070d05"} Mar 12 14:46:02 crc kubenswrapper[4921]: I0312 14:46:02.421187 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" podStartSLOduration=1.51093212 podStartE2EDuration="2.42116869s" podCreationTimestamp="2026-03-12 14:46:00 +0000 UTC" firstStartedPulling="2026-03-12 14:46:00.91143291 +0000 UTC m=+5783.601504881" lastFinishedPulling="2026-03-12 14:46:01.82166947 +0000 UTC m=+5784.511741451" 
observedRunningTime="2026-03-12 14:46:02.415372631 +0000 UTC m=+5785.105444602" watchObservedRunningTime="2026-03-12 14:46:02.42116869 +0000 UTC m=+5785.111240661" Mar 12 14:46:03 crc kubenswrapper[4921]: I0312 14:46:03.406727 4921 generic.go:334] "Generic (PLEG): container finished" podID="cd4c3bd5-1c00-4331-acf3-8af324f7258d" containerID="5a7d2ac7a8ab4ffe63febb50413d8f4e6139a3794c817c5130dc124b90070d05" exitCode=0 Mar 12 14:46:03 crc kubenswrapper[4921]: I0312 14:46:03.406778 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" event={"ID":"cd4c3bd5-1c00-4331-acf3-8af324f7258d","Type":"ContainerDied","Data":"5a7d2ac7a8ab4ffe63febb50413d8f4e6139a3794c817c5130dc124b90070d05"} Mar 12 14:46:04 crc kubenswrapper[4921]: I0312 14:46:04.982567 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.095745 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htcsz\" (UniqueName: \"kubernetes.io/projected/cd4c3bd5-1c00-4331-acf3-8af324f7258d-kube-api-access-htcsz\") pod \"cd4c3bd5-1c00-4331-acf3-8af324f7258d\" (UID: \"cd4c3bd5-1c00-4331-acf3-8af324f7258d\") " Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.101480 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4c3bd5-1c00-4331-acf3-8af324f7258d-kube-api-access-htcsz" (OuterVolumeSpecName: "kube-api-access-htcsz") pod "cd4c3bd5-1c00-4331-acf3-8af324f7258d" (UID: "cd4c3bd5-1c00-4331-acf3-8af324f7258d"). InnerVolumeSpecName "kube-api-access-htcsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.198041 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htcsz\" (UniqueName: \"kubernetes.io/projected/cd4c3bd5-1c00-4331-acf3-8af324f7258d-kube-api-access-htcsz\") on node \"crc\" DevicePath \"\"" Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.425574 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" event={"ID":"cd4c3bd5-1c00-4331-acf3-8af324f7258d","Type":"ContainerDied","Data":"63b308c74a1cb7962d4e1c7b4fa0ea852ca09adf2733115e47bb3ba916137d1d"} Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.425620 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63b308c74a1cb7962d4e1c7b4fa0ea852ca09adf2733115e47bb3ba916137d1d" Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.425680 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555446-wvxn9" Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.475383 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555440-crc6g"] Mar 12 14:46:05 crc kubenswrapper[4921]: I0312 14:46:05.483429 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555440-crc6g"] Mar 12 14:46:06 crc kubenswrapper[4921]: I0312 14:46:06.020774 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44afce34-b62f-457b-a225-f8be11c8b20b" path="/var/lib/kubelet/pods/44afce34-b62f-457b-a225-f8be11c8b20b/volumes" Mar 12 14:46:08 crc kubenswrapper[4921]: I0312 14:46:08.983046 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:46:08 crc kubenswrapper[4921]: E0312 14:46:08.983580 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:46:23 crc kubenswrapper[4921]: I0312 14:46:23.984046 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:46:23 crc kubenswrapper[4921]: E0312 14:46:23.984677 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:46:35 crc kubenswrapper[4921]: I0312 14:46:35.984105 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:46:35 crc kubenswrapper[4921]: E0312 14:46:35.985124 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:46:48 crc kubenswrapper[4921]: I0312 14:46:48.582965 4921 scope.go:117] "RemoveContainer" containerID="04aa8d94f68de59ff537c18cc1213af985c0e558e84cb1291be0fbc0a881d4a1" Mar 12 14:46:50 crc kubenswrapper[4921]: I0312 14:46:50.983563 4921 scope.go:117] "RemoveContainer" 
containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:46:50 crc kubenswrapper[4921]: E0312 14:46:50.984072 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:47:05 crc kubenswrapper[4921]: I0312 14:47:05.983772 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:47:05 crc kubenswrapper[4921]: E0312 14:47:05.986619 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:47:20 crc kubenswrapper[4921]: I0312 14:47:20.983701 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:47:20 crc kubenswrapper[4921]: E0312 14:47:20.984626 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:47:35 crc kubenswrapper[4921]: I0312 14:47:35.983502 4921 scope.go:117] 
"RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:47:35 crc kubenswrapper[4921]: E0312 14:47:35.984349 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:47:49 crc kubenswrapper[4921]: I0312 14:47:49.983983 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:47:49 crc kubenswrapper[4921]: E0312 14:47:49.984838 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.160284 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555448-fzqh6"] Mar 12 14:48:00 crc kubenswrapper[4921]: E0312 14:48:00.161636 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4c3bd5-1c00-4331-acf3-8af324f7258d" containerName="oc" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.161663 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4c3bd5-1c00-4331-acf3-8af324f7258d" containerName="oc" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.162217 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4c3bd5-1c00-4331-acf3-8af324f7258d" containerName="oc" Mar 
12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.163500 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.166912 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.167857 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.168316 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.182536 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555448-fzqh6"] Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.302442 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjcl\" (UniqueName: \"kubernetes.io/projected/c3ebedac-8417-4852-bd01-4872169fbee2-kube-api-access-sqjcl\") pod \"auto-csr-approver-29555448-fzqh6\" (UID: \"c3ebedac-8417-4852-bd01-4872169fbee2\") " pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.404421 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjcl\" (UniqueName: \"kubernetes.io/projected/c3ebedac-8417-4852-bd01-4872169fbee2-kube-api-access-sqjcl\") pod \"auto-csr-approver-29555448-fzqh6\" (UID: \"c3ebedac-8417-4852-bd01-4872169fbee2\") " pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.423467 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjcl\" (UniqueName: 
\"kubernetes.io/projected/c3ebedac-8417-4852-bd01-4872169fbee2-kube-api-access-sqjcl\") pod \"auto-csr-approver-29555448-fzqh6\" (UID: \"c3ebedac-8417-4852-bd01-4872169fbee2\") " pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.486705 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:00 crc kubenswrapper[4921]: I0312 14:48:00.991761 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555448-fzqh6"] Mar 12 14:48:00 crc kubenswrapper[4921]: W0312 14:48:00.997780 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3ebedac_8417_4852_bd01_4872169fbee2.slice/crio-7ec104bf129ca9d21fb0598f1258b48b9c7eb3d0e0f204275ddb430a6a691336 WatchSource:0}: Error finding container 7ec104bf129ca9d21fb0598f1258b48b9c7eb3d0e0f204275ddb430a6a691336: Status 404 returned error can't find the container with id 7ec104bf129ca9d21fb0598f1258b48b9c7eb3d0e0f204275ddb430a6a691336 Mar 12 14:48:01 crc kubenswrapper[4921]: I0312 14:48:01.464364 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" event={"ID":"c3ebedac-8417-4852-bd01-4872169fbee2","Type":"ContainerStarted","Data":"7ec104bf129ca9d21fb0598f1258b48b9c7eb3d0e0f204275ddb430a6a691336"} Mar 12 14:48:01 crc kubenswrapper[4921]: I0312 14:48:01.983640 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:48:01 crc kubenswrapper[4921]: E0312 14:48:01.984055 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:48:02 crc kubenswrapper[4921]: I0312 14:48:02.476616 4921 generic.go:334] "Generic (PLEG): container finished" podID="c3ebedac-8417-4852-bd01-4872169fbee2" containerID="7e65b65fca86f12435fc9f37fdfc799029ce773c52eb66fd91decf5d780faa8e" exitCode=0 Mar 12 14:48:02 crc kubenswrapper[4921]: I0312 14:48:02.476664 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" event={"ID":"c3ebedac-8417-4852-bd01-4872169fbee2","Type":"ContainerDied","Data":"7e65b65fca86f12435fc9f37fdfc799029ce773c52eb66fd91decf5d780faa8e"} Mar 12 14:48:03 crc kubenswrapper[4921]: I0312 14:48:03.987486 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:04 crc kubenswrapper[4921]: I0312 14:48:04.076608 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjcl\" (UniqueName: \"kubernetes.io/projected/c3ebedac-8417-4852-bd01-4872169fbee2-kube-api-access-sqjcl\") pod \"c3ebedac-8417-4852-bd01-4872169fbee2\" (UID: \"c3ebedac-8417-4852-bd01-4872169fbee2\") " Mar 12 14:48:04 crc kubenswrapper[4921]: I0312 14:48:04.083955 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ebedac-8417-4852-bd01-4872169fbee2-kube-api-access-sqjcl" (OuterVolumeSpecName: "kube-api-access-sqjcl") pod "c3ebedac-8417-4852-bd01-4872169fbee2" (UID: "c3ebedac-8417-4852-bd01-4872169fbee2"). InnerVolumeSpecName "kube-api-access-sqjcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:04 crc kubenswrapper[4921]: I0312 14:48:04.179758 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjcl\" (UniqueName: \"kubernetes.io/projected/c3ebedac-8417-4852-bd01-4872169fbee2-kube-api-access-sqjcl\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:04 crc kubenswrapper[4921]: I0312 14:48:04.493258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" event={"ID":"c3ebedac-8417-4852-bd01-4872169fbee2","Type":"ContainerDied","Data":"7ec104bf129ca9d21fb0598f1258b48b9c7eb3d0e0f204275ddb430a6a691336"} Mar 12 14:48:04 crc kubenswrapper[4921]: I0312 14:48:04.493598 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ec104bf129ca9d21fb0598f1258b48b9c7eb3d0e0f204275ddb430a6a691336" Mar 12 14:48:04 crc kubenswrapper[4921]: I0312 14:48:04.493317 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555448-fzqh6" Mar 12 14:48:05 crc kubenswrapper[4921]: I0312 14:48:05.091433 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555442-6brsm"] Mar 12 14:48:05 crc kubenswrapper[4921]: I0312 14:48:05.100856 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555442-6brsm"] Mar 12 14:48:06 crc kubenswrapper[4921]: I0312 14:48:06.012541 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2756ba22-178c-45ec-b7ac-4661efa5b786" path="/var/lib/kubelet/pods/2756ba22-178c-45ec-b7ac-4661efa5b786/volumes" Mar 12 14:48:15 crc kubenswrapper[4921]: I0312 14:48:15.985424 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:48:15 crc kubenswrapper[4921]: E0312 14:48:15.988176 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:48:27 crc kubenswrapper[4921]: I0312 14:48:27.993682 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:48:27 crc kubenswrapper[4921]: E0312 14:48:27.995181 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.406623 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-258jg"] Mar 12 14:48:38 crc kubenswrapper[4921]: E0312 14:48:38.409064 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ebedac-8417-4852-bd01-4872169fbee2" containerName="oc" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.409230 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ebedac-8417-4852-bd01-4872169fbee2" containerName="oc" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.412014 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ebedac-8417-4852-bd01-4872169fbee2" containerName="oc" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.414002 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.448689 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-258jg"] Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.449792 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-catalog-content\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.449882 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-utilities\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.450138 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58v4q\" (UniqueName: \"kubernetes.io/projected/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-kube-api-access-58v4q\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.551627 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-utilities\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.551733 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-58v4q\" (UniqueName: \"kubernetes.io/projected/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-kube-api-access-58v4q\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.551874 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-catalog-content\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.552395 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-catalog-content\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.552648 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-utilities\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.582583 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58v4q\" (UniqueName: \"kubernetes.io/projected/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-kube-api-access-58v4q\") pod \"certified-operators-258jg\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:38 crc kubenswrapper[4921]: I0312 14:48:38.739557 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:39 crc kubenswrapper[4921]: I0312 14:48:39.307287 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-258jg"] Mar 12 14:48:39 crc kubenswrapper[4921]: I0312 14:48:39.818795 4921 generic.go:334] "Generic (PLEG): container finished" podID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerID="a533d780177bd354f9c8756fe6fe3b288824fa3c15aa77c8f67bad0f47b2a55d" exitCode=0 Mar 12 14:48:39 crc kubenswrapper[4921]: I0312 14:48:39.818867 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerDied","Data":"a533d780177bd354f9c8756fe6fe3b288824fa3c15aa77c8f67bad0f47b2a55d"} Mar 12 14:48:39 crc kubenswrapper[4921]: I0312 14:48:39.819110 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerStarted","Data":"b3c9dd41948f82262d0023d25c71090e651ea1d1f8e107ddae7e7d59aab64988"} Mar 12 14:48:39 crc kubenswrapper[4921]: I0312 14:48:39.985398 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:48:39 crc kubenswrapper[4921]: E0312 14:48:39.985734 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:48:40 crc kubenswrapper[4921]: I0312 14:48:40.830148 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" 
event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerStarted","Data":"a743fd0f8035a74b708560fd44f6d281bb1df3ba40d835b0e2aa420e566d9895"} Mar 12 14:48:41 crc kubenswrapper[4921]: I0312 14:48:41.839891 4921 generic.go:334] "Generic (PLEG): container finished" podID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerID="a743fd0f8035a74b708560fd44f6d281bb1df3ba40d835b0e2aa420e566d9895" exitCode=0 Mar 12 14:48:41 crc kubenswrapper[4921]: I0312 14:48:41.840009 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerDied","Data":"a743fd0f8035a74b708560fd44f6d281bb1df3ba40d835b0e2aa420e566d9895"} Mar 12 14:48:42 crc kubenswrapper[4921]: I0312 14:48:42.852368 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerStarted","Data":"221809a05987c772b344286f3507a4ed22642fcdd294d47aefe9a00bd3b465c8"} Mar 12 14:48:42 crc kubenswrapper[4921]: I0312 14:48:42.881982 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-258jg" podStartSLOduration=2.3282427070000002 podStartE2EDuration="4.881960023s" podCreationTimestamp="2026-03-12 14:48:38 +0000 UTC" firstStartedPulling="2026-03-12 14:48:39.822677293 +0000 UTC m=+5942.512749264" lastFinishedPulling="2026-03-12 14:48:42.376394609 +0000 UTC m=+5945.066466580" observedRunningTime="2026-03-12 14:48:42.873301205 +0000 UTC m=+5945.563373186" watchObservedRunningTime="2026-03-12 14:48:42.881960023 +0000 UTC m=+5945.572031994" Mar 12 14:48:48 crc kubenswrapper[4921]: I0312 14:48:48.700305 4921 scope.go:117] "RemoveContainer" containerID="54dfcbe34888661c76bb40160619ffeb6f5162afadc76849206237e1e0714a76" Mar 12 14:48:48 crc kubenswrapper[4921]: I0312 14:48:48.740303 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:48 crc kubenswrapper[4921]: I0312 14:48:48.740804 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:48 crc kubenswrapper[4921]: I0312 14:48:48.801107 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:48 crc kubenswrapper[4921]: I0312 14:48:48.962427 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:49 crc kubenswrapper[4921]: I0312 14:48:49.036469 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-258jg"] Mar 12 14:48:50 crc kubenswrapper[4921]: I0312 14:48:50.935250 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-258jg" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="registry-server" containerID="cri-o://221809a05987c772b344286f3507a4ed22642fcdd294d47aefe9a00bd3b465c8" gracePeriod=2 Mar 12 14:48:51 crc kubenswrapper[4921]: I0312 14:48:51.944852 4921 generic.go:334] "Generic (PLEG): container finished" podID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerID="221809a05987c772b344286f3507a4ed22642fcdd294d47aefe9a00bd3b465c8" exitCode=0 Mar 12 14:48:51 crc kubenswrapper[4921]: I0312 14:48:51.944915 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerDied","Data":"221809a05987c772b344286f3507a4ed22642fcdd294d47aefe9a00bd3b465c8"} Mar 12 14:48:51 crc kubenswrapper[4921]: I0312 14:48:51.945372 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-258jg" 
event={"ID":"4a321b6c-8470-4a80-9f95-997ab4b2dd1d","Type":"ContainerDied","Data":"b3c9dd41948f82262d0023d25c71090e651ea1d1f8e107ddae7e7d59aab64988"} Mar 12 14:48:51 crc kubenswrapper[4921]: I0312 14:48:51.945385 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3c9dd41948f82262d0023d25c71090e651ea1d1f8e107ddae7e7d59aab64988" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.036575 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.151510 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-utilities\") pod \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.151665 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-catalog-content\") pod \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.151723 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58v4q\" (UniqueName: \"kubernetes.io/projected/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-kube-api-access-58v4q\") pod \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\" (UID: \"4a321b6c-8470-4a80-9f95-997ab4b2dd1d\") " Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.152428 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-utilities" (OuterVolumeSpecName: "utilities") pod "4a321b6c-8470-4a80-9f95-997ab4b2dd1d" (UID: "4a321b6c-8470-4a80-9f95-997ab4b2dd1d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.162142 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-kube-api-access-58v4q" (OuterVolumeSpecName: "kube-api-access-58v4q") pod "4a321b6c-8470-4a80-9f95-997ab4b2dd1d" (UID: "4a321b6c-8470-4a80-9f95-997ab4b2dd1d"). InnerVolumeSpecName "kube-api-access-58v4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.207911 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a321b6c-8470-4a80-9f95-997ab4b2dd1d" (UID: "4a321b6c-8470-4a80-9f95-997ab4b2dd1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.254345 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.254384 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58v4q\" (UniqueName: \"kubernetes.io/projected/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-kube-api-access-58v4q\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.254399 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a321b6c-8470-4a80-9f95-997ab4b2dd1d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.952314 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-258jg" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.983871 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:48:52 crc kubenswrapper[4921]: E0312 14:48:52.984193 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.985376 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-258jg"] Mar 12 14:48:52 crc kubenswrapper[4921]: I0312 14:48:52.994190 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-258jg"] Mar 12 14:48:53 crc kubenswrapper[4921]: I0312 14:48:53.999644 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" path="/var/lib/kubelet/pods/4a321b6c-8470-4a80-9f95-997ab4b2dd1d/volumes" Mar 12 14:49:03 crc kubenswrapper[4921]: I0312 14:49:03.984015 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:49:03 crc kubenswrapper[4921]: E0312 14:49:03.984801 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:49:14 crc kubenswrapper[4921]: I0312 14:49:14.983150 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:49:14 crc kubenswrapper[4921]: E0312 14:49:14.984791 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:49:28 crc kubenswrapper[4921]: I0312 14:49:28.983911 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:49:28 crc kubenswrapper[4921]: E0312 14:49:28.985135 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:49:42 crc kubenswrapper[4921]: I0312 14:49:42.984026 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:49:42 crc kubenswrapper[4921]: E0312 14:49:42.984826 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:49:56 crc kubenswrapper[4921]: I0312 14:49:56.984237 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:49:56 crc kubenswrapper[4921]: E0312 14:49:56.985492 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.025432 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555450-7tkm6"] Mar 12 14:50:01 crc kubenswrapper[4921]: E0312 14:50:01.037441 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="extract-utilities" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.037461 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="extract-utilities" Mar 12 14:50:01 crc kubenswrapper[4921]: E0312 14:50:01.037488 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="registry-server" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.037495 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="registry-server" Mar 12 14:50:01 crc kubenswrapper[4921]: E0312 14:50:01.037520 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="extract-content" Mar 12 14:50:01 crc 
kubenswrapper[4921]: I0312 14:50:01.037526 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="extract-content" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.037688 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a321b6c-8470-4a80-9f95-997ab4b2dd1d" containerName="registry-server" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.042209 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-7tkm6"] Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.042300 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.049966 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.050145 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.050228 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.123458 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvrf\" (UniqueName: \"kubernetes.io/projected/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09-kube-api-access-4dvrf\") pod \"auto-csr-approver-29555450-7tkm6\" (UID: \"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09\") " pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.225129 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvrf\" (UniqueName: \"kubernetes.io/projected/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09-kube-api-access-4dvrf\") pod 
\"auto-csr-approver-29555450-7tkm6\" (UID: \"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09\") " pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.257769 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvrf\" (UniqueName: \"kubernetes.io/projected/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09-kube-api-access-4dvrf\") pod \"auto-csr-approver-29555450-7tkm6\" (UID: \"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09\") " pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.370140 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.839859 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-7tkm6"] Mar 12 14:50:01 crc kubenswrapper[4921]: I0312 14:50:01.995372 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" event={"ID":"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09","Type":"ContainerStarted","Data":"73b7306c5112f41f243fefd4d69817c929c9c0da53eff82b6e423da3bac576ec"} Mar 12 14:50:04 crc kubenswrapper[4921]: I0312 14:50:04.032664 4921 generic.go:334] "Generic (PLEG): container finished" podID="1bb62a06-2852-4ddc-8f71-2cbaf81a1f09" containerID="f64e27dbf4ded37b05f1df6c081e4432172ae7e1dba50d2769f4a7f4f03639b6" exitCode=0 Mar 12 14:50:04 crc kubenswrapper[4921]: I0312 14:50:04.032955 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" event={"ID":"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09","Type":"ContainerDied","Data":"f64e27dbf4ded37b05f1df6c081e4432172ae7e1dba50d2769f4a7f4f03639b6"} Mar 12 14:50:05 crc kubenswrapper[4921]: I0312 14:50:05.614120 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:05 crc kubenswrapper[4921]: I0312 14:50:05.721031 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dvrf\" (UniqueName: \"kubernetes.io/projected/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09-kube-api-access-4dvrf\") pod \"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09\" (UID: \"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09\") " Mar 12 14:50:05 crc kubenswrapper[4921]: I0312 14:50:05.738720 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09-kube-api-access-4dvrf" (OuterVolumeSpecName: "kube-api-access-4dvrf") pod "1bb62a06-2852-4ddc-8f71-2cbaf81a1f09" (UID: "1bb62a06-2852-4ddc-8f71-2cbaf81a1f09"). InnerVolumeSpecName "kube-api-access-4dvrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:05 crc kubenswrapper[4921]: I0312 14:50:05.823489 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dvrf\" (UniqueName: \"kubernetes.io/projected/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09-kube-api-access-4dvrf\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:06 crc kubenswrapper[4921]: I0312 14:50:06.067197 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" event={"ID":"1bb62a06-2852-4ddc-8f71-2cbaf81a1f09","Type":"ContainerDied","Data":"73b7306c5112f41f243fefd4d69817c929c9c0da53eff82b6e423da3bac576ec"} Mar 12 14:50:06 crc kubenswrapper[4921]: I0312 14:50:06.067260 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73b7306c5112f41f243fefd4d69817c929c9c0da53eff82b6e423da3bac576ec" Mar 12 14:50:06 crc kubenswrapper[4921]: I0312 14:50:06.067268 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555450-7tkm6" Mar 12 14:50:06 crc kubenswrapper[4921]: I0312 14:50:06.688923 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555444-jhrjr"] Mar 12 14:50:06 crc kubenswrapper[4921]: I0312 14:50:06.698246 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555444-jhrjr"] Mar 12 14:50:07 crc kubenswrapper[4921]: I0312 14:50:07.995091 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c40cbff-69ba-4640-ac73-497e0d633cee" path="/var/lib/kubelet/pods/7c40cbff-69ba-4640-ac73-497e0d633cee/volumes" Mar 12 14:50:09 crc kubenswrapper[4921]: I0312 14:50:09.983357 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:50:09 crc kubenswrapper[4921]: E0312 14:50:09.984265 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.123634 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pq7d"] Mar 12 14:50:23 crc kubenswrapper[4921]: E0312 14:50:23.124709 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bb62a06-2852-4ddc-8f71-2cbaf81a1f09" containerName="oc" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.124725 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb62a06-2852-4ddc-8f71-2cbaf81a1f09" containerName="oc" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.125027 4921 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1bb62a06-2852-4ddc-8f71-2cbaf81a1f09" containerName="oc" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.131090 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.135535 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pq7d"] Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.196785 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-utilities\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.196900 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlvl\" (UniqueName: \"kubernetes.io/projected/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-kube-api-access-zhlvl\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.197613 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-catalog-content\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.299752 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-utilities\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " 
pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.299904 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlvl\" (UniqueName: \"kubernetes.io/projected/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-kube-api-access-zhlvl\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.299971 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-catalog-content\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.301842 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-utilities\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.301909 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-catalog-content\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.328054 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlvl\" (UniqueName: \"kubernetes.io/projected/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-kube-api-access-zhlvl\") pod \"redhat-operators-7pq7d\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " pod="openshift-marketplace/redhat-operators-7pq7d" Mar 
12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.467299 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.985078 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:50:23 crc kubenswrapper[4921]: E0312 14:50:23.985464 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:50:23 crc kubenswrapper[4921]: I0312 14:50:23.997227 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pq7d"] Mar 12 14:50:24 crc kubenswrapper[4921]: I0312 14:50:24.238207 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerID="55919ad2f07f1351f158d7de321b6d110efa98730fa9d5cc0c1bb8f2292d75f5" exitCode=0 Mar 12 14:50:24 crc kubenswrapper[4921]: I0312 14:50:24.238484 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerDied","Data":"55919ad2f07f1351f158d7de321b6d110efa98730fa9d5cc0c1bb8f2292d75f5"} Mar 12 14:50:24 crc kubenswrapper[4921]: I0312 14:50:24.238581 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerStarted","Data":"372bcb476fc729ae3b67b8c6d545142dc6504000720af0c50854dd754826d82c"} Mar 12 14:50:26 crc kubenswrapper[4921]: I0312 14:50:26.258959 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerStarted","Data":"d50b9a07a3cb04ee085f3a1833bb38cb52d129c588f2fd5aa7fffc69ed0cd732"} Mar 12 14:50:30 crc kubenswrapper[4921]: I0312 14:50:30.307429 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerID="d50b9a07a3cb04ee085f3a1833bb38cb52d129c588f2fd5aa7fffc69ed0cd732" exitCode=0 Mar 12 14:50:30 crc kubenswrapper[4921]: I0312 14:50:30.308051 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerDied","Data":"d50b9a07a3cb04ee085f3a1833bb38cb52d129c588f2fd5aa7fffc69ed0cd732"} Mar 12 14:50:31 crc kubenswrapper[4921]: I0312 14:50:31.317252 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerStarted","Data":"8358147ab4b155d70903ed4d2d50dd092c12a98844dbfa39c8bbf82f059bf35f"} Mar 12 14:50:31 crc kubenswrapper[4921]: I0312 14:50:31.338641 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pq7d" podStartSLOduration=1.7989933919999999 podStartE2EDuration="8.338622566s" podCreationTimestamp="2026-03-12 14:50:23 +0000 UTC" firstStartedPulling="2026-03-12 14:50:24.240879422 +0000 UTC m=+6046.930951393" lastFinishedPulling="2026-03-12 14:50:30.780508586 +0000 UTC m=+6053.470580567" observedRunningTime="2026-03-12 14:50:31.331577968 +0000 UTC m=+6054.021649959" watchObservedRunningTime="2026-03-12 14:50:31.338622566 +0000 UTC m=+6054.028694537" Mar 12 14:50:33 crc kubenswrapper[4921]: I0312 14:50:33.467732 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:33 crc 
kubenswrapper[4921]: I0312 14:50:33.468212 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:34 crc kubenswrapper[4921]: I0312 14:50:34.516060 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7pq7d" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="registry-server" probeResult="failure" output=< Mar 12 14:50:34 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 14:50:34 crc kubenswrapper[4921]: > Mar 12 14:50:34 crc kubenswrapper[4921]: I0312 14:50:34.983756 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:50:34 crc kubenswrapper[4921]: E0312 14:50:34.984040 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:50:43 crc kubenswrapper[4921]: I0312 14:50:43.520931 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:43 crc kubenswrapper[4921]: I0312 14:50:43.568805 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:43 crc kubenswrapper[4921]: I0312 14:50:43.753494 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pq7d"] Mar 12 14:50:45 crc kubenswrapper[4921]: I0312 14:50:45.447145 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7pq7d" 
podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="registry-server" containerID="cri-o://8358147ab4b155d70903ed4d2d50dd092c12a98844dbfa39c8bbf82f059bf35f" gracePeriod=2 Mar 12 14:50:46 crc kubenswrapper[4921]: I0312 14:50:46.569863 4921 generic.go:334] "Generic (PLEG): container finished" podID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerID="8358147ab4b155d70903ed4d2d50dd092c12a98844dbfa39c8bbf82f059bf35f" exitCode=0 Mar 12 14:50:46 crc kubenswrapper[4921]: I0312 14:50:46.570128 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerDied","Data":"8358147ab4b155d70903ed4d2d50dd092c12a98844dbfa39c8bbf82f059bf35f"} Mar 12 14:50:46 crc kubenswrapper[4921]: I0312 14:50:46.570152 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pq7d" event={"ID":"ddb5137a-5a3d-4c7a-8570-b51244bd3d19","Type":"ContainerDied","Data":"372bcb476fc729ae3b67b8c6d545142dc6504000720af0c50854dd754826d82c"} Mar 12 14:50:46 crc kubenswrapper[4921]: I0312 14:50:46.570162 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372bcb476fc729ae3b67b8c6d545142dc6504000720af0c50854dd754826d82c" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.088789 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:50:47 crc kubenswrapper[4921]: E0312 14:50:47.089510 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:50:47 crc 
kubenswrapper[4921]: I0312 14:50:47.106060 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.284751 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-utilities\") pod \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.285140 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-catalog-content\") pod \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.285643 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-utilities" (OuterVolumeSpecName: "utilities") pod "ddb5137a-5a3d-4c7a-8570-b51244bd3d19" (UID: "ddb5137a-5a3d-4c7a-8570-b51244bd3d19"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.286147 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhlvl\" (UniqueName: \"kubernetes.io/projected/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-kube-api-access-zhlvl\") pod \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\" (UID: \"ddb5137a-5a3d-4c7a-8570-b51244bd3d19\") " Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.287359 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.292198 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-kube-api-access-zhlvl" (OuterVolumeSpecName: "kube-api-access-zhlvl") pod "ddb5137a-5a3d-4c7a-8570-b51244bd3d19" (UID: "ddb5137a-5a3d-4c7a-8570-b51244bd3d19"). InnerVolumeSpecName "kube-api-access-zhlvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.411354 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ddb5137a-5a3d-4c7a-8570-b51244bd3d19" (UID: "ddb5137a-5a3d-4c7a-8570-b51244bd3d19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.575465 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhlvl\" (UniqueName: \"kubernetes.io/projected/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-kube-api-access-zhlvl\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.575565 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ddb5137a-5a3d-4c7a-8570-b51244bd3d19-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.589406 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pq7d" Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.620281 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pq7d"] Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.630388 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7pq7d"] Mar 12 14:50:47 crc kubenswrapper[4921]: I0312 14:50:47.995553 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" path="/var/lib/kubelet/pods/ddb5137a-5a3d-4c7a-8570-b51244bd3d19/volumes" Mar 12 14:50:48 crc kubenswrapper[4921]: I0312 14:50:48.831029 4921 scope.go:117] "RemoveContainer" containerID="02d900a501938721b11ff641ab6214e62635421ace13d98fea7e7c9a64b09883" Mar 12 14:51:00 crc kubenswrapper[4921]: I0312 14:51:00.983738 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:51:01 crc kubenswrapper[4921]: I0312 14:51:01.714143 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"930648be0d8bf646882c87307f1b96290a71998ec99e0c13a0e1dc2dae923eb4"} Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.140593 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555452-6pdzr"] Mar 12 14:52:00 crc kubenswrapper[4921]: E0312 14:52:00.141593 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="extract-content" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.141613 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="extract-content" Mar 12 14:52:00 crc kubenswrapper[4921]: E0312 14:52:00.141652 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="registry-server" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.141659 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="registry-server" Mar 12 14:52:00 crc kubenswrapper[4921]: E0312 14:52:00.141674 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="extract-utilities" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.141683 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="extract-utilities" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.141933 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddb5137a-5a3d-4c7a-8570-b51244bd3d19" containerName="registry-server" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.142695 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.145037 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.145139 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.145498 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.152603 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-6pdzr"] Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.239827 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkn6l\" (UniqueName: \"kubernetes.io/projected/00f63d31-2de6-4734-9554-da341c053194-kube-api-access-fkn6l\") pod \"auto-csr-approver-29555452-6pdzr\" (UID: \"00f63d31-2de6-4734-9554-da341c053194\") " pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.341511 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkn6l\" (UniqueName: \"kubernetes.io/projected/00f63d31-2de6-4734-9554-da341c053194-kube-api-access-fkn6l\") pod \"auto-csr-approver-29555452-6pdzr\" (UID: \"00f63d31-2de6-4734-9554-da341c053194\") " pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.360127 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkn6l\" (UniqueName: \"kubernetes.io/projected/00f63d31-2de6-4734-9554-da341c053194-kube-api-access-fkn6l\") pod \"auto-csr-approver-29555452-6pdzr\" (UID: \"00f63d31-2de6-4734-9554-da341c053194\") " 
pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.460939 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.913014 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-6pdzr"] Mar 12 14:52:00 crc kubenswrapper[4921]: W0312 14:52:00.932719 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00f63d31_2de6_4734_9554_da341c053194.slice/crio-81abc447c5b3ff1cee4ca4baf937a190dc7cbcf0468666d8a7e046a8c7319c61 WatchSource:0}: Error finding container 81abc447c5b3ff1cee4ca4baf937a190dc7cbcf0468666d8a7e046a8c7319c61: Status 404 returned error can't find the container with id 81abc447c5b3ff1cee4ca4baf937a190dc7cbcf0468666d8a7e046a8c7319c61 Mar 12 14:52:00 crc kubenswrapper[4921]: I0312 14:52:00.935234 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:52:01 crc kubenswrapper[4921]: I0312 14:52:01.674337 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" event={"ID":"00f63d31-2de6-4734-9554-da341c053194","Type":"ContainerStarted","Data":"81abc447c5b3ff1cee4ca4baf937a190dc7cbcf0468666d8a7e046a8c7319c61"} Mar 12 14:52:05 crc kubenswrapper[4921]: I0312 14:52:05.715342 4921 generic.go:334] "Generic (PLEG): container finished" podID="00f63d31-2de6-4734-9554-da341c053194" containerID="cd5a8c41220f999fea617e2ed3037db486bbdd6edf7e4522663bccc91b929664" exitCode=0 Mar 12 14:52:05 crc kubenswrapper[4921]: I0312 14:52:05.715389 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" 
event={"ID":"00f63d31-2de6-4734-9554-da341c053194","Type":"ContainerDied","Data":"cd5a8c41220f999fea617e2ed3037db486bbdd6edf7e4522663bccc91b929664"} Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.144267 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.232650 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkn6l\" (UniqueName: \"kubernetes.io/projected/00f63d31-2de6-4734-9554-da341c053194-kube-api-access-fkn6l\") pod \"00f63d31-2de6-4734-9554-da341c053194\" (UID: \"00f63d31-2de6-4734-9554-da341c053194\") " Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.239249 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f63d31-2de6-4734-9554-da341c053194-kube-api-access-fkn6l" (OuterVolumeSpecName: "kube-api-access-fkn6l") pod "00f63d31-2de6-4734-9554-da341c053194" (UID: "00f63d31-2de6-4734-9554-da341c053194"). InnerVolumeSpecName "kube-api-access-fkn6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.337422 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkn6l\" (UniqueName: \"kubernetes.io/projected/00f63d31-2de6-4734-9554-da341c053194-kube-api-access-fkn6l\") on node \"crc\" DevicePath \"\"" Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.733922 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" event={"ID":"00f63d31-2de6-4734-9554-da341c053194","Type":"ContainerDied","Data":"81abc447c5b3ff1cee4ca4baf937a190dc7cbcf0468666d8a7e046a8c7319c61"} Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.734260 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81abc447c5b3ff1cee4ca4baf937a190dc7cbcf0468666d8a7e046a8c7319c61" Mar 12 14:52:07 crc kubenswrapper[4921]: I0312 14:52:07.733989 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555452-6pdzr" Mar 12 14:52:08 crc kubenswrapper[4921]: I0312 14:52:08.226687 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555446-wvxn9"] Mar 12 14:52:08 crc kubenswrapper[4921]: I0312 14:52:08.236690 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555446-wvxn9"] Mar 12 14:52:09 crc kubenswrapper[4921]: I0312 14:52:09.996190 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4c3bd5-1c00-4331-acf3-8af324f7258d" path="/var/lib/kubelet/pods/cd4c3bd5-1c00-4331-acf3-8af324f7258d/volumes" Mar 12 14:52:48 crc kubenswrapper[4921]: I0312 14:52:48.934675 4921 scope.go:117] "RemoveContainer" containerID="5a7d2ac7a8ab4ffe63febb50413d8f4e6139a3794c817c5130dc124b90070d05" Mar 12 14:53:26 crc kubenswrapper[4921]: I0312 14:53:26.324366 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:53:26 crc kubenswrapper[4921]: I0312 14:53:26.325751 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:53:56 crc kubenswrapper[4921]: I0312 14:53:56.323921 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:53:56 crc kubenswrapper[4921]: I0312 14:53:56.324671 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.145873 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555454-6v9ll"] Mar 12 14:54:00 crc kubenswrapper[4921]: E0312 14:54:00.146845 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f63d31-2de6-4734-9554-da341c053194" containerName="oc" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.146859 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f63d31-2de6-4734-9554-da341c053194" containerName="oc" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.147050 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="00f63d31-2de6-4734-9554-da341c053194" containerName="oc" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.148028 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.150017 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.150071 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.151235 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.172162 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-6v9ll"] Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.254247 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bph4r\" (UniqueName: \"kubernetes.io/projected/b08f4757-cd45-491d-a4cd-e0a08adc9fbd-kube-api-access-bph4r\") pod \"auto-csr-approver-29555454-6v9ll\" (UID: \"b08f4757-cd45-491d-a4cd-e0a08adc9fbd\") " pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.356532 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bph4r\" (UniqueName: \"kubernetes.io/projected/b08f4757-cd45-491d-a4cd-e0a08adc9fbd-kube-api-access-bph4r\") pod \"auto-csr-approver-29555454-6v9ll\" (UID: \"b08f4757-cd45-491d-a4cd-e0a08adc9fbd\") " pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.378355 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bph4r\" (UniqueName: \"kubernetes.io/projected/b08f4757-cd45-491d-a4cd-e0a08adc9fbd-kube-api-access-bph4r\") pod \"auto-csr-approver-29555454-6v9ll\" (UID: \"b08f4757-cd45-491d-a4cd-e0a08adc9fbd\") " pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.479926 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.909324 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-6v9ll"] Mar 12 14:54:00 crc kubenswrapper[4921]: I0312 14:54:00.938365 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" event={"ID":"b08f4757-cd45-491d-a4cd-e0a08adc9fbd","Type":"ContainerStarted","Data":"90f9deb6ebedf7808b73fd17e97170e592c517b32431c68129099c96307f4628"} Mar 12 14:54:02 crc kubenswrapper[4921]: I0312 14:54:02.961465 4921 generic.go:334] "Generic (PLEG): container finished" podID="b08f4757-cd45-491d-a4cd-e0a08adc9fbd" containerID="2aa0e2c24cede3c575862f8ae39d111cea8f06c750256c5534c2bd1a7974b594" exitCode=0 Mar 12 14:54:02 crc kubenswrapper[4921]: I0312 14:54:02.961566 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" event={"ID":"b08f4757-cd45-491d-a4cd-e0a08adc9fbd","Type":"ContainerDied","Data":"2aa0e2c24cede3c575862f8ae39d111cea8f06c750256c5534c2bd1a7974b594"} Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.432544 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.541469 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bph4r\" (UniqueName: \"kubernetes.io/projected/b08f4757-cd45-491d-a4cd-e0a08adc9fbd-kube-api-access-bph4r\") pod \"b08f4757-cd45-491d-a4cd-e0a08adc9fbd\" (UID: \"b08f4757-cd45-491d-a4cd-e0a08adc9fbd\") " Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.551832 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08f4757-cd45-491d-a4cd-e0a08adc9fbd-kube-api-access-bph4r" (OuterVolumeSpecName: "kube-api-access-bph4r") pod "b08f4757-cd45-491d-a4cd-e0a08adc9fbd" (UID: "b08f4757-cd45-491d-a4cd-e0a08adc9fbd"). InnerVolumeSpecName "kube-api-access-bph4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.644151 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bph4r\" (UniqueName: \"kubernetes.io/projected/b08f4757-cd45-491d-a4cd-e0a08adc9fbd-kube-api-access-bph4r\") on node \"crc\" DevicePath \"\"" Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.988133 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" event={"ID":"b08f4757-cd45-491d-a4cd-e0a08adc9fbd","Type":"ContainerDied","Data":"90f9deb6ebedf7808b73fd17e97170e592c517b32431c68129099c96307f4628"} Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.988517 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f9deb6ebedf7808b73fd17e97170e592c517b32431c68129099c96307f4628" Mar 12 14:54:04 crc kubenswrapper[4921]: I0312 14:54:04.988161 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555454-6v9ll" Mar 12 14:54:05 crc kubenswrapper[4921]: I0312 14:54:05.504794 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555448-fzqh6"] Mar 12 14:54:05 crc kubenswrapper[4921]: I0312 14:54:05.512618 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555448-fzqh6"] Mar 12 14:54:05 crc kubenswrapper[4921]: I0312 14:54:05.994253 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ebedac-8417-4852-bd01-4872169fbee2" path="/var/lib/kubelet/pods/c3ebedac-8417-4852-bd01-4872169fbee2/volumes" Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.324077 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.324793 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.324907 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.325709 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"930648be0d8bf646882c87307f1b96290a71998ec99e0c13a0e1dc2dae923eb4"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.325807 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://930648be0d8bf646882c87307f1b96290a71998ec99e0c13a0e1dc2dae923eb4" gracePeriod=600 Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.996103 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="930648be0d8bf646882c87307f1b96290a71998ec99e0c13a0e1dc2dae923eb4" exitCode=0 Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.996195 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"930648be0d8bf646882c87307f1b96290a71998ec99e0c13a0e1dc2dae923eb4"} Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.996475 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e"} Mar 12 14:54:26 crc kubenswrapper[4921]: I0312 14:54:26.996498 4921 scope.go:117] "RemoveContainer" containerID="302a82b5cbf553ac44de10f5c6feef58633e0ba89b80a036da561d61652d0f01" Mar 12 14:54:49 crc kubenswrapper[4921]: I0312 14:54:49.020999 4921 scope.go:117] "RemoveContainer" containerID="221809a05987c772b344286f3507a4ed22642fcdd294d47aefe9a00bd3b465c8" Mar 12 14:54:49 crc kubenswrapper[4921]: I0312 14:54:49.041874 4921 scope.go:117] "RemoveContainer" containerID="a743fd0f8035a74b708560fd44f6d281bb1df3ba40d835b0e2aa420e566d9895" Mar 12 14:54:49 crc kubenswrapper[4921]: I0312 
14:54:49.075124 4921 scope.go:117] "RemoveContainer" containerID="7e65b65fca86f12435fc9f37fdfc799029ce773c52eb66fd91decf5d780faa8e" Mar 12 14:54:49 crc kubenswrapper[4921]: I0312 14:54:49.144289 4921 scope.go:117] "RemoveContainer" containerID="a533d780177bd354f9c8756fe6fe3b288824fa3c15aa77c8f67bad0f47b2a55d" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.142441 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555456-f6hzz"] Mar 12 14:56:00 crc kubenswrapper[4921]: E0312 14:56:00.143268 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08f4757-cd45-491d-a4cd-e0a08adc9fbd" containerName="oc" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.143279 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08f4757-cd45-491d-a4cd-e0a08adc9fbd" containerName="oc" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.143469 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08f4757-cd45-491d-a4cd-e0a08adc9fbd" containerName="oc" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.144147 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.146302 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.146488 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.148243 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.160826 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-f6hzz"] Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.265503 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h97hn\" (UniqueName: \"kubernetes.io/projected/68143e2b-5e11-40f2-8771-fde5a32f2188-kube-api-access-h97hn\") pod \"auto-csr-approver-29555456-f6hzz\" (UID: \"68143e2b-5e11-40f2-8771-fde5a32f2188\") " pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.367706 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h97hn\" (UniqueName: \"kubernetes.io/projected/68143e2b-5e11-40f2-8771-fde5a32f2188-kube-api-access-h97hn\") pod \"auto-csr-approver-29555456-f6hzz\" (UID: \"68143e2b-5e11-40f2-8771-fde5a32f2188\") " pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.389638 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h97hn\" (UniqueName: \"kubernetes.io/projected/68143e2b-5e11-40f2-8771-fde5a32f2188-kube-api-access-h97hn\") pod \"auto-csr-approver-29555456-f6hzz\" (UID: \"68143e2b-5e11-40f2-8771-fde5a32f2188\") " 
pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.494137 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:00 crc kubenswrapper[4921]: I0312 14:56:00.973637 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-f6hzz"] Mar 12 14:56:01 crc kubenswrapper[4921]: I0312 14:56:01.574844 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" event={"ID":"68143e2b-5e11-40f2-8771-fde5a32f2188","Type":"ContainerStarted","Data":"22c6b73153f61bd2daf4843e132d58d4eadbcdfdab2798528860ed651e5a820b"} Mar 12 14:56:02 crc kubenswrapper[4921]: I0312 14:56:02.585862 4921 generic.go:334] "Generic (PLEG): container finished" podID="68143e2b-5e11-40f2-8771-fde5a32f2188" containerID="67ac367f6c7f66d3c84623e180dce28ce62234c2d6d40b8f2be1adae461ab5b4" exitCode=0 Mar 12 14:56:02 crc kubenswrapper[4921]: I0312 14:56:02.585984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" event={"ID":"68143e2b-5e11-40f2-8771-fde5a32f2188","Type":"ContainerDied","Data":"67ac367f6c7f66d3c84623e180dce28ce62234c2d6d40b8f2be1adae461ab5b4"} Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.057476 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.156332 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h97hn\" (UniqueName: \"kubernetes.io/projected/68143e2b-5e11-40f2-8771-fde5a32f2188-kube-api-access-h97hn\") pod \"68143e2b-5e11-40f2-8771-fde5a32f2188\" (UID: \"68143e2b-5e11-40f2-8771-fde5a32f2188\") " Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.170041 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68143e2b-5e11-40f2-8771-fde5a32f2188-kube-api-access-h97hn" (OuterVolumeSpecName: "kube-api-access-h97hn") pod "68143e2b-5e11-40f2-8771-fde5a32f2188" (UID: "68143e2b-5e11-40f2-8771-fde5a32f2188"). InnerVolumeSpecName "kube-api-access-h97hn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.259138 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h97hn\" (UniqueName: \"kubernetes.io/projected/68143e2b-5e11-40f2-8771-fde5a32f2188-kube-api-access-h97hn\") on node \"crc\" DevicePath \"\"" Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.602245 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" event={"ID":"68143e2b-5e11-40f2-8771-fde5a32f2188","Type":"ContainerDied","Data":"22c6b73153f61bd2daf4843e132d58d4eadbcdfdab2798528860ed651e5a820b"} Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.602644 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22c6b73153f61bd2daf4843e132d58d4eadbcdfdab2798528860ed651e5a820b" Mar 12 14:56:04 crc kubenswrapper[4921]: I0312 14:56:04.602316 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555456-f6hzz" Mar 12 14:56:05 crc kubenswrapper[4921]: I0312 14:56:05.130729 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-7tkm6"] Mar 12 14:56:05 crc kubenswrapper[4921]: I0312 14:56:05.139337 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555450-7tkm6"] Mar 12 14:56:05 crc kubenswrapper[4921]: I0312 14:56:05.994081 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bb62a06-2852-4ddc-8f71-2cbaf81a1f09" path="/var/lib/kubelet/pods/1bb62a06-2852-4ddc-8f71-2cbaf81a1f09/volumes" Mar 12 14:56:26 crc kubenswrapper[4921]: I0312 14:56:26.323868 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:56:26 crc kubenswrapper[4921]: I0312 14:56:26.324767 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:56:49 crc kubenswrapper[4921]: I0312 14:56:49.263894 4921 scope.go:117] "RemoveContainer" containerID="f64e27dbf4ded37b05f1df6c081e4432172ae7e1dba50d2769f4a7f4f03639b6" Mar 12 14:56:49 crc kubenswrapper[4921]: I0312 14:56:49.311801 4921 scope.go:117] "RemoveContainer" containerID="55919ad2f07f1351f158d7de321b6d110efa98730fa9d5cc0c1bb8f2292d75f5" Mar 12 14:56:49 crc kubenswrapper[4921]: I0312 14:56:49.336075 4921 scope.go:117] "RemoveContainer" containerID="8358147ab4b155d70903ed4d2d50dd092c12a98844dbfa39c8bbf82f059bf35f" Mar 12 14:56:49 crc 
kubenswrapper[4921]: I0312 14:56:49.393515 4921 scope.go:117] "RemoveContainer" containerID="d50b9a07a3cb04ee085f3a1833bb38cb52d129c588f2fd5aa7fffc69ed0cd732" Mar 12 14:56:56 crc kubenswrapper[4921]: I0312 14:56:56.324251 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:56:56 crc kubenswrapper[4921]: I0312 14:56:56.324802 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.324446 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.325100 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.325155 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.326074 4921 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.326147 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" gracePeriod=600 Mar 12 14:57:26 crc kubenswrapper[4921]: E0312 14:57:26.447031 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.619061 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" exitCode=0 Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.619115 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e"} Mar 12 14:57:26 crc kubenswrapper[4921]: I0312 14:57:26.619156 4921 scope.go:117] "RemoveContainer" containerID="930648be0d8bf646882c87307f1b96290a71998ec99e0c13a0e1dc2dae923eb4" Mar 12 14:57:26 crc 
kubenswrapper[4921]: I0312 14:57:26.619986 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:57:26 crc kubenswrapper[4921]: E0312 14:57:26.620587 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:57:40 crc kubenswrapper[4921]: I0312 14:57:40.983542 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:57:40 crc kubenswrapper[4921]: E0312 14:57:40.984139 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.864852 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9kd2p"] Mar 12 14:57:51 crc kubenswrapper[4921]: E0312 14:57:51.866282 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68143e2b-5e11-40f2-8771-fde5a32f2188" containerName="oc" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.866307 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="68143e2b-5e11-40f2-8771-fde5a32f2188" containerName="oc" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.866686 4921 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="68143e2b-5e11-40f2-8771-fde5a32f2188" containerName="oc" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.869505 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.897239 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kd2p"] Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.939022 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhw5w\" (UniqueName: \"kubernetes.io/projected/5cce59f8-0511-441c-998f-b87d820f093c-kube-api-access-zhw5w\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.939173 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-catalog-content\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.939246 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-utilities\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:51 crc kubenswrapper[4921]: I0312 14:57:51.983300 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:57:51 crc kubenswrapper[4921]: E0312 14:57:51.983602 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.040055 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhw5w\" (UniqueName: \"kubernetes.io/projected/5cce59f8-0511-441c-998f-b87d820f093c-kube-api-access-zhw5w\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.040183 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-catalog-content\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.040266 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-utilities\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.040992 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-catalog-content\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.041166 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-utilities\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.072582 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhw5w\" (UniqueName: \"kubernetes.io/projected/5cce59f8-0511-441c-998f-b87d820f093c-kube-api-access-zhw5w\") pod \"redhat-marketplace-9kd2p\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.194308 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.699703 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kd2p"] Mar 12 14:57:52 crc kubenswrapper[4921]: I0312 14:57:52.875166 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerStarted","Data":"bf2b50c8ed9c964ffb62dc466dba5b218b4b1d5dd6c36dceec40d4aae3c2e680"} Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.261198 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qthkb"] Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.264356 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.271012 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qthkb"] Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.364546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6mbx\" (UniqueName: \"kubernetes.io/projected/75b9c6bc-85ba-409b-9fac-78de164cbeeb-kube-api-access-v6mbx\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.364617 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-catalog-content\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.364733 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-utilities\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.466326 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-utilities\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.466460 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v6mbx\" (UniqueName: \"kubernetes.io/projected/75b9c6bc-85ba-409b-9fac-78de164cbeeb-kube-api-access-v6mbx\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.466507 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-catalog-content\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.466886 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-utilities\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.466915 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-catalog-content\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.485022 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6mbx\" (UniqueName: \"kubernetes.io/projected/75b9c6bc-85ba-409b-9fac-78de164cbeeb-kube-api-access-v6mbx\") pod \"community-operators-qthkb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.581784 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.892762 4921 generic.go:334] "Generic (PLEG): container finished" podID="5cce59f8-0511-441c-998f-b87d820f093c" containerID="32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692" exitCode=0 Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.892807 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerDied","Data":"32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692"} Mar 12 14:57:53 crc kubenswrapper[4921]: I0312 14:57:53.895789 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:57:54 crc kubenswrapper[4921]: W0312 14:57:54.163006 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75b9c6bc_85ba_409b_9fac_78de164cbeeb.slice/crio-b681ca16739c55488683db0e517fada67766fdab6d9e3bf8df8d8e063e366dc2 WatchSource:0}: Error finding container b681ca16739c55488683db0e517fada67766fdab6d9e3bf8df8d8e063e366dc2: Status 404 returned error can't find the container with id b681ca16739c55488683db0e517fada67766fdab6d9e3bf8df8d8e063e366dc2 Mar 12 14:57:54 crc kubenswrapper[4921]: I0312 14:57:54.164628 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qthkb"] Mar 12 14:57:54 crc kubenswrapper[4921]: I0312 14:57:54.902325 4921 generic.go:334] "Generic (PLEG): container finished" podID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerID="5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d" exitCode=0 Mar 12 14:57:54 crc kubenswrapper[4921]: I0312 14:57:54.902421 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qthkb" 
event={"ID":"75b9c6bc-85ba-409b-9fac-78de164cbeeb","Type":"ContainerDied","Data":"5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d"} Mar 12 14:57:54 crc kubenswrapper[4921]: I0312 14:57:54.902988 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qthkb" event={"ID":"75b9c6bc-85ba-409b-9fac-78de164cbeeb","Type":"ContainerStarted","Data":"b681ca16739c55488683db0e517fada67766fdab6d9e3bf8df8d8e063e366dc2"} Mar 12 14:57:54 crc kubenswrapper[4921]: I0312 14:57:54.907026 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerStarted","Data":"bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa"} Mar 12 14:57:55 crc kubenswrapper[4921]: I0312 14:57:55.925262 4921 generic.go:334] "Generic (PLEG): container finished" podID="5cce59f8-0511-441c-998f-b87d820f093c" containerID="bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa" exitCode=0 Mar 12 14:57:55 crc kubenswrapper[4921]: I0312 14:57:55.925426 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerDied","Data":"bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa"} Mar 12 14:57:56 crc kubenswrapper[4921]: I0312 14:57:56.936030 4921 generic.go:334] "Generic (PLEG): container finished" podID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerID="fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa" exitCode=0 Mar 12 14:57:56 crc kubenswrapper[4921]: I0312 14:57:56.936138 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qthkb" event={"ID":"75b9c6bc-85ba-409b-9fac-78de164cbeeb","Type":"ContainerDied","Data":"fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa"} Mar 12 14:57:56 crc kubenswrapper[4921]: I0312 
14:57:56.940417 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerStarted","Data":"805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf"} Mar 12 14:57:57 crc kubenswrapper[4921]: I0312 14:57:57.950360 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qthkb" event={"ID":"75b9c6bc-85ba-409b-9fac-78de164cbeeb","Type":"ContainerStarted","Data":"f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59"} Mar 12 14:57:57 crc kubenswrapper[4921]: I0312 14:57:57.978513 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qthkb" podStartSLOduration=2.526597176 podStartE2EDuration="4.978493582s" podCreationTimestamp="2026-03-12 14:57:53 +0000 UTC" firstStartedPulling="2026-03-12 14:57:54.905020223 +0000 UTC m=+6497.595092194" lastFinishedPulling="2026-03-12 14:57:57.356916629 +0000 UTC m=+6500.046988600" observedRunningTime="2026-03-12 14:57:57.974036964 +0000 UTC m=+6500.664108945" watchObservedRunningTime="2026-03-12 14:57:57.978493582 +0000 UTC m=+6500.668565553" Mar 12 14:57:57 crc kubenswrapper[4921]: I0312 14:57:57.980054 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9kd2p" podStartSLOduration=4.570176253 podStartE2EDuration="6.98004692s" podCreationTimestamp="2026-03-12 14:57:51 +0000 UTC" firstStartedPulling="2026-03-12 14:57:53.895543254 +0000 UTC m=+6496.585615225" lastFinishedPulling="2026-03-12 14:57:56.305413921 +0000 UTC m=+6498.995485892" observedRunningTime="2026-03-12 14:57:56.977467775 +0000 UTC m=+6499.667539766" watchObservedRunningTime="2026-03-12 14:57:57.98004692 +0000 UTC m=+6500.670118891" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.147945 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29555458-qd8sq"] Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.149455 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.152047 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.152546 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.155764 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.158989 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-qd8sq"] Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.304447 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94kl\" (UniqueName: \"kubernetes.io/projected/1f75c568-d4f3-4787-b73d-e08ed0712f97-kube-api-access-k94kl\") pod \"auto-csr-approver-29555458-qd8sq\" (UID: \"1f75c568-d4f3-4787-b73d-e08ed0712f97\") " pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.406608 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k94kl\" (UniqueName: \"kubernetes.io/projected/1f75c568-d4f3-4787-b73d-e08ed0712f97-kube-api-access-k94kl\") pod \"auto-csr-approver-29555458-qd8sq\" (UID: \"1f75c568-d4f3-4787-b73d-e08ed0712f97\") " pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.435682 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k94kl\" (UniqueName: 
\"kubernetes.io/projected/1f75c568-d4f3-4787-b73d-e08ed0712f97-kube-api-access-k94kl\") pod \"auto-csr-approver-29555458-qd8sq\" (UID: \"1f75c568-d4f3-4787-b73d-e08ed0712f97\") " pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.470100 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:00 crc kubenswrapper[4921]: I0312 14:58:00.965329 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-qd8sq"] Mar 12 14:58:00 crc kubenswrapper[4921]: W0312 14:58:00.971923 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f75c568_d4f3_4787_b73d_e08ed0712f97.slice/crio-1a676dcf32ffd00c72bfea86bd3ab8aebd205a4166617eee0b90ba8e2ab03d71 WatchSource:0}: Error finding container 1a676dcf32ffd00c72bfea86bd3ab8aebd205a4166617eee0b90ba8e2ab03d71: Status 404 returned error can't find the container with id 1a676dcf32ffd00c72bfea86bd3ab8aebd205a4166617eee0b90ba8e2ab03d71 Mar 12 14:58:01 crc kubenswrapper[4921]: I0312 14:58:01.997721 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" event={"ID":"1f75c568-d4f3-4787-b73d-e08ed0712f97","Type":"ContainerStarted","Data":"1a676dcf32ffd00c72bfea86bd3ab8aebd205a4166617eee0b90ba8e2ab03d71"} Mar 12 14:58:02 crc kubenswrapper[4921]: I0312 14:58:02.195497 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:58:02 crc kubenswrapper[4921]: I0312 14:58:02.195556 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:58:02 crc kubenswrapper[4921]: I0312 14:58:02.245873 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:58:03 crc kubenswrapper[4921]: I0312 14:58:03.012647 4921 generic.go:334] "Generic (PLEG): container finished" podID="1f75c568-d4f3-4787-b73d-e08ed0712f97" containerID="0e014f11b7f8403e227ceebe40e7dd4ea56b6624d53e75bb7ee0540ee3074e88" exitCode=0 Mar 12 14:58:03 crc kubenswrapper[4921]: I0312 14:58:03.014683 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" event={"ID":"1f75c568-d4f3-4787-b73d-e08ed0712f97","Type":"ContainerDied","Data":"0e014f11b7f8403e227ceebe40e7dd4ea56b6624d53e75bb7ee0540ee3074e88"} Mar 12 14:58:03 crc kubenswrapper[4921]: I0312 14:58:03.083495 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:58:03 crc kubenswrapper[4921]: I0312 14:58:03.582561 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:58:03 crc kubenswrapper[4921]: I0312 14:58:03.582599 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:58:03 crc kubenswrapper[4921]: I0312 14:58:03.626058 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:58:04 crc kubenswrapper[4921]: I0312 14:58:04.091261 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:58:04 crc kubenswrapper[4921]: I0312 14:58:04.402617 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:04 crc kubenswrapper[4921]: I0312 14:58:04.404294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k94kl\" (UniqueName: \"kubernetes.io/projected/1f75c568-d4f3-4787-b73d-e08ed0712f97-kube-api-access-k94kl\") pod \"1f75c568-d4f3-4787-b73d-e08ed0712f97\" (UID: \"1f75c568-d4f3-4787-b73d-e08ed0712f97\") " Mar 12 14:58:04 crc kubenswrapper[4921]: I0312 14:58:04.411573 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f75c568-d4f3-4787-b73d-e08ed0712f97-kube-api-access-k94kl" (OuterVolumeSpecName: "kube-api-access-k94kl") pod "1f75c568-d4f3-4787-b73d-e08ed0712f97" (UID: "1f75c568-d4f3-4787-b73d-e08ed0712f97"). InnerVolumeSpecName "kube-api-access-k94kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:58:04 crc kubenswrapper[4921]: I0312 14:58:04.507795 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k94kl\" (UniqueName: \"kubernetes.io/projected/1f75c568-d4f3-4787-b73d-e08ed0712f97-kube-api-access-k94kl\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:04 crc kubenswrapper[4921]: I0312 14:58:04.649167 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kd2p"] Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.032323 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" event={"ID":"1f75c568-d4f3-4787-b73d-e08ed0712f97","Type":"ContainerDied","Data":"1a676dcf32ffd00c72bfea86bd3ab8aebd205a4166617eee0b90ba8e2ab03d71"} Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.032372 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a676dcf32ffd00c72bfea86bd3ab8aebd205a4166617eee0b90ba8e2ab03d71" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.032445 4921 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9kd2p" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="registry-server" containerID="cri-o://805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf" gracePeriod=2 Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.032908 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555458-qd8sq" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.476872 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-6pdzr"] Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.507507 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555452-6pdzr"] Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.547561 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.728278 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-utilities\") pod \"5cce59f8-0511-441c-998f-b87d820f093c\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.728430 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-catalog-content\") pod \"5cce59f8-0511-441c-998f-b87d820f093c\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.728477 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhw5w\" (UniqueName: 
\"kubernetes.io/projected/5cce59f8-0511-441c-998f-b87d820f093c-kube-api-access-zhw5w\") pod \"5cce59f8-0511-441c-998f-b87d820f093c\" (UID: \"5cce59f8-0511-441c-998f-b87d820f093c\") " Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.729524 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-utilities" (OuterVolumeSpecName: "utilities") pod "5cce59f8-0511-441c-998f-b87d820f093c" (UID: "5cce59f8-0511-441c-998f-b87d820f093c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.734503 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cce59f8-0511-441c-998f-b87d820f093c-kube-api-access-zhw5w" (OuterVolumeSpecName: "kube-api-access-zhw5w") pod "5cce59f8-0511-441c-998f-b87d820f093c" (UID: "5cce59f8-0511-441c-998f-b87d820f093c"). InnerVolumeSpecName "kube-api-access-zhw5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.761461 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cce59f8-0511-441c-998f-b87d820f093c" (UID: "5cce59f8-0511-441c-998f-b87d820f093c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.831144 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.831181 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cce59f8-0511-441c-998f-b87d820f093c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.831193 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhw5w\" (UniqueName: \"kubernetes.io/projected/5cce59f8-0511-441c-998f-b87d820f093c-kube-api-access-zhw5w\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.983989 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:58:05 crc kubenswrapper[4921]: E0312 14:58:05.984494 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:58:05 crc kubenswrapper[4921]: I0312 14:58:05.998116 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f63d31-2de6-4734-9554-da341c053194" path="/var/lib/kubelet/pods/00f63d31-2de6-4734-9554-da341c053194/volumes" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.054921 4921 generic.go:334] "Generic (PLEG): container finished" podID="5cce59f8-0511-441c-998f-b87d820f093c" 
containerID="805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf" exitCode=0 Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.055109 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerDied","Data":"805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf"} Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.055141 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9kd2p" event={"ID":"5cce59f8-0511-441c-998f-b87d820f093c","Type":"ContainerDied","Data":"bf2b50c8ed9c964ffb62dc466dba5b218b4b1d5dd6c36dceec40d4aae3c2e680"} Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.055269 4921 scope.go:117] "RemoveContainer" containerID="805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.055675 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9kd2p" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.065378 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qthkb"] Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.065752 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qthkb" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="registry-server" containerID="cri-o://f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59" gracePeriod=2 Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.092559 4921 scope.go:117] "RemoveContainer" containerID="bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.092946 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kd2p"] Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.101539 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9kd2p"] Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.113410 4921 scope.go:117] "RemoveContainer" containerID="32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.269459 4921 scope.go:117] "RemoveContainer" containerID="805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf" Mar 12 14:58:06 crc kubenswrapper[4921]: E0312 14:58:06.276872 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf\": container with ID starting with 805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf not found: ID does not exist" containerID="805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf" Mar 12 14:58:06 crc 
kubenswrapper[4921]: I0312 14:58:06.276935 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf"} err="failed to get container status \"805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf\": rpc error: code = NotFound desc = could not find container \"805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf\": container with ID starting with 805cf23246604a6e7190f755be738aa969411c95ace18168491ac45e5cc37edf not found: ID does not exist" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.276961 4921 scope.go:117] "RemoveContainer" containerID="bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa" Mar 12 14:58:06 crc kubenswrapper[4921]: E0312 14:58:06.277964 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa\": container with ID starting with bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa not found: ID does not exist" containerID="bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.278001 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa"} err="failed to get container status \"bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa\": rpc error: code = NotFound desc = could not find container \"bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa\": container with ID starting with bdf2f80a8b98d523a953e18b7432ac8059aa80331a909b33d1019c17b842b4fa not found: ID does not exist" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.278028 4921 scope.go:117] "RemoveContainer" containerID="32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692" Mar 12 
14:58:06 crc kubenswrapper[4921]: E0312 14:58:06.278566 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692\": container with ID starting with 32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692 not found: ID does not exist" containerID="32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.278592 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692"} err="failed to get container status \"32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692\": rpc error: code = NotFound desc = could not find container \"32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692\": container with ID starting with 32e25535b2f57aae1f8b2e902eb090357bada3071ceed79f2ba66d5d5abf5692 not found: ID does not exist" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.509373 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.647097 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-utilities\") pod \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.647228 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6mbx\" (UniqueName: \"kubernetes.io/projected/75b9c6bc-85ba-409b-9fac-78de164cbeeb-kube-api-access-v6mbx\") pod \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.647397 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-catalog-content\") pod \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\" (UID: \"75b9c6bc-85ba-409b-9fac-78de164cbeeb\") " Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.648058 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-utilities" (OuterVolumeSpecName: "utilities") pod "75b9c6bc-85ba-409b-9fac-78de164cbeeb" (UID: "75b9c6bc-85ba-409b-9fac-78de164cbeeb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.657714 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b9c6bc-85ba-409b-9fac-78de164cbeeb-kube-api-access-v6mbx" (OuterVolumeSpecName: "kube-api-access-v6mbx") pod "75b9c6bc-85ba-409b-9fac-78de164cbeeb" (UID: "75b9c6bc-85ba-409b-9fac-78de164cbeeb"). InnerVolumeSpecName "kube-api-access-v6mbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.709277 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75b9c6bc-85ba-409b-9fac-78de164cbeeb" (UID: "75b9c6bc-85ba-409b-9fac-78de164cbeeb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.750120 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6mbx\" (UniqueName: \"kubernetes.io/projected/75b9c6bc-85ba-409b-9fac-78de164cbeeb-kube-api-access-v6mbx\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.750162 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:06 crc kubenswrapper[4921]: I0312 14:58:06.750172 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75b9c6bc-85ba-409b-9fac-78de164cbeeb-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.065870 4921 generic.go:334] "Generic (PLEG): container finished" podID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerID="f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59" exitCode=0 Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.065912 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qthkb" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.065942 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qthkb" event={"ID":"75b9c6bc-85ba-409b-9fac-78de164cbeeb","Type":"ContainerDied","Data":"f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59"} Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.066361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qthkb" event={"ID":"75b9c6bc-85ba-409b-9fac-78de164cbeeb","Type":"ContainerDied","Data":"b681ca16739c55488683db0e517fada67766fdab6d9e3bf8df8d8e063e366dc2"} Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.066407 4921 scope.go:117] "RemoveContainer" containerID="f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.090238 4921 scope.go:117] "RemoveContainer" containerID="fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.113444 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qthkb"] Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.125297 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qthkb"] Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.126244 4921 scope.go:117] "RemoveContainer" containerID="5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.158918 4921 scope.go:117] "RemoveContainer" containerID="f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59" Mar 12 14:58:07 crc kubenswrapper[4921]: E0312 14:58:07.159288 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59\": container with ID starting with f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59 not found: ID does not exist" containerID="f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.159333 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59"} err="failed to get container status \"f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59\": rpc error: code = NotFound desc = could not find container \"f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59\": container with ID starting with f9f47dea2808c45063af6dd55661dde2cc4fd28cb7759a72343ffb0e0e018e59 not found: ID does not exist" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.159363 4921 scope.go:117] "RemoveContainer" containerID="fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa" Mar 12 14:58:07 crc kubenswrapper[4921]: E0312 14:58:07.159951 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa\": container with ID starting with fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa not found: ID does not exist" containerID="fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.159991 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa"} err="failed to get container status \"fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa\": rpc error: code = NotFound desc = could not find container \"fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa\": container with ID 
starting with fdaa8fae7bd7dbfdfcdb92f4ed3c4cff0eeac9f28a6a36b4e6b9197ad53960fa not found: ID does not exist" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.160019 4921 scope.go:117] "RemoveContainer" containerID="5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d" Mar 12 14:58:07 crc kubenswrapper[4921]: E0312 14:58:07.160516 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d\": container with ID starting with 5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d not found: ID does not exist" containerID="5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.160551 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d"} err="failed to get container status \"5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d\": rpc error: code = NotFound desc = could not find container \"5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d\": container with ID starting with 5b756a2c2252a5ff94d348e0bdfa5553074603347715d1d8f483294a86d37a8d not found: ID does not exist" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.993907 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cce59f8-0511-441c-998f-b87d820f093c" path="/var/lib/kubelet/pods/5cce59f8-0511-441c-998f-b87d820f093c/volumes" Mar 12 14:58:07 crc kubenswrapper[4921]: I0312 14:58:07.994569 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" path="/var/lib/kubelet/pods/75b9c6bc-85ba-409b-9fac-78de164cbeeb/volumes" Mar 12 14:58:18 crc kubenswrapper[4921]: I0312 14:58:18.983340 4921 scope.go:117] "RemoveContainer" 
containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:58:18 crc kubenswrapper[4921]: E0312 14:58:18.984193 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:58:31 crc kubenswrapper[4921]: I0312 14:58:31.983898 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:58:31 crc kubenswrapper[4921]: E0312 14:58:31.984731 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:58:44 crc kubenswrapper[4921]: I0312 14:58:44.984268 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:58:44 crc kubenswrapper[4921]: E0312 14:58:44.985168 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:58:49 crc kubenswrapper[4921]: I0312 14:58:49.487939 4921 scope.go:117] 
"RemoveContainer" containerID="cd5a8c41220f999fea617e2ed3037db486bbdd6edf7e4522663bccc91b929664" Mar 12 14:58:57 crc kubenswrapper[4921]: I0312 14:58:57.990055 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:58:57 crc kubenswrapper[4921]: E0312 14:58:57.992035 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898061 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nr58v"] Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898798 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="extract-utilities" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898829 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="extract-utilities" Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898844 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="registry-server" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898851 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="registry-server" Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898865 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="registry-server" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 
14:58:59.898873 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="registry-server" Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898903 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75c568-d4f3-4787-b73d-e08ed0712f97" containerName="oc" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898911 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75c568-d4f3-4787-b73d-e08ed0712f97" containerName="oc" Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898927 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="extract-content" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898932 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="extract-content" Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898941 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="extract-content" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898946 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="extract-content" Mar 12 14:58:59 crc kubenswrapper[4921]: E0312 14:58:59.898961 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="extract-utilities" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.898967 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="extract-utilities" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.899145 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cce59f8-0511-441c-998f-b87d820f093c" containerName="registry-server" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.899162 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1f75c568-d4f3-4787-b73d-e08ed0712f97" containerName="oc" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.899171 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b9c6bc-85ba-409b-9fac-78de164cbeeb" containerName="registry-server" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.900585 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:58:59 crc kubenswrapper[4921]: I0312 14:58:59.910122 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nr58v"] Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.084383 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-catalog-content\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.084449 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82lwf\" (UniqueName: \"kubernetes.io/projected/d96c7171-483e-4f1c-bc44-97005b79e976-kube-api-access-82lwf\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.084546 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-utilities\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 
14:59:00.186900 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-utilities\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.187203 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-catalog-content\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.187297 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82lwf\" (UniqueName: \"kubernetes.io/projected/d96c7171-483e-4f1c-bc44-97005b79e976-kube-api-access-82lwf\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.187521 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-catalog-content\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.187664 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-utilities\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.214777 4921 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82lwf\" (UniqueName: \"kubernetes.io/projected/d96c7171-483e-4f1c-bc44-97005b79e976-kube-api-access-82lwf\") pod \"certified-operators-nr58v\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.219775 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:00 crc kubenswrapper[4921]: I0312 14:59:00.728492 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nr58v"] Mar 12 14:59:01 crc kubenswrapper[4921]: I0312 14:59:01.593786 4921 generic.go:334] "Generic (PLEG): container finished" podID="d96c7171-483e-4f1c-bc44-97005b79e976" containerID="a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd" exitCode=0 Mar 12 14:59:01 crc kubenswrapper[4921]: I0312 14:59:01.593880 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr58v" event={"ID":"d96c7171-483e-4f1c-bc44-97005b79e976","Type":"ContainerDied","Data":"a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd"} Mar 12 14:59:01 crc kubenswrapper[4921]: I0312 14:59:01.594533 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr58v" event={"ID":"d96c7171-483e-4f1c-bc44-97005b79e976","Type":"ContainerStarted","Data":"ba4a8c07b3355ad4341f783bc73811ab36d8d727bf44eb87abdb36e52cebb3a5"} Mar 12 14:59:03 crc kubenswrapper[4921]: I0312 14:59:03.612934 4921 generic.go:334] "Generic (PLEG): container finished" podID="d96c7171-483e-4f1c-bc44-97005b79e976" containerID="fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0" exitCode=0 Mar 12 14:59:03 crc kubenswrapper[4921]: I0312 14:59:03.612996 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-nr58v" event={"ID":"d96c7171-483e-4f1c-bc44-97005b79e976","Type":"ContainerDied","Data":"fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0"} Mar 12 14:59:06 crc kubenswrapper[4921]: I0312 14:59:06.645254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr58v" event={"ID":"d96c7171-483e-4f1c-bc44-97005b79e976","Type":"ContainerStarted","Data":"d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef"} Mar 12 14:59:06 crc kubenswrapper[4921]: I0312 14:59:06.687627 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nr58v" podStartSLOduration=3.301159294 podStartE2EDuration="7.687597539s" podCreationTimestamp="2026-03-12 14:58:59 +0000 UTC" firstStartedPulling="2026-03-12 14:59:01.596453311 +0000 UTC m=+6564.286525312" lastFinishedPulling="2026-03-12 14:59:05.982891586 +0000 UTC m=+6568.672963557" observedRunningTime="2026-03-12 14:59:06.67370383 +0000 UTC m=+6569.363775841" watchObservedRunningTime="2026-03-12 14:59:06.687597539 +0000 UTC m=+6569.377669540" Mar 12 14:59:09 crc kubenswrapper[4921]: I0312 14:59:09.984336 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:59:09 crc kubenswrapper[4921]: E0312 14:59:09.985296 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:59:10 crc kubenswrapper[4921]: I0312 14:59:10.224498 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:10 crc kubenswrapper[4921]: I0312 14:59:10.224571 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:10 crc kubenswrapper[4921]: I0312 14:59:10.276143 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:20 crc kubenswrapper[4921]: I0312 14:59:20.275889 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:20 crc kubenswrapper[4921]: I0312 14:59:20.331701 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nr58v"] Mar 12 14:59:20 crc kubenswrapper[4921]: I0312 14:59:20.779711 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nr58v" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="registry-server" containerID="cri-o://d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef" gracePeriod=2 Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.278315 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.369334 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-utilities\") pod \"d96c7171-483e-4f1c-bc44-97005b79e976\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.369416 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82lwf\" (UniqueName: \"kubernetes.io/projected/d96c7171-483e-4f1c-bc44-97005b79e976-kube-api-access-82lwf\") pod \"d96c7171-483e-4f1c-bc44-97005b79e976\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.369457 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-catalog-content\") pod \"d96c7171-483e-4f1c-bc44-97005b79e976\" (UID: \"d96c7171-483e-4f1c-bc44-97005b79e976\") " Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.370961 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-utilities" (OuterVolumeSpecName: "utilities") pod "d96c7171-483e-4f1c-bc44-97005b79e976" (UID: "d96c7171-483e-4f1c-bc44-97005b79e976"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.376002 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96c7171-483e-4f1c-bc44-97005b79e976-kube-api-access-82lwf" (OuterVolumeSpecName: "kube-api-access-82lwf") pod "d96c7171-483e-4f1c-bc44-97005b79e976" (UID: "d96c7171-483e-4f1c-bc44-97005b79e976"). InnerVolumeSpecName "kube-api-access-82lwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.430699 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d96c7171-483e-4f1c-bc44-97005b79e976" (UID: "d96c7171-483e-4f1c-bc44-97005b79e976"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.472277 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.472307 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82lwf\" (UniqueName: \"kubernetes.io/projected/d96c7171-483e-4f1c-bc44-97005b79e976-kube-api-access-82lwf\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.472317 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d96c7171-483e-4f1c-bc44-97005b79e976-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.788483 4921 generic.go:334] "Generic (PLEG): container finished" podID="d96c7171-483e-4f1c-bc44-97005b79e976" containerID="d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef" exitCode=0 Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.788541 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nr58v" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.788560 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr58v" event={"ID":"d96c7171-483e-4f1c-bc44-97005b79e976","Type":"ContainerDied","Data":"d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef"} Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.788986 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nr58v" event={"ID":"d96c7171-483e-4f1c-bc44-97005b79e976","Type":"ContainerDied","Data":"ba4a8c07b3355ad4341f783bc73811ab36d8d727bf44eb87abdb36e52cebb3a5"} Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.789009 4921 scope.go:117] "RemoveContainer" containerID="d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.814661 4921 scope.go:117] "RemoveContainer" containerID="fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.819467 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nr58v"] Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.827480 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nr58v"] Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.837427 4921 scope.go:117] "RemoveContainer" containerID="a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.877100 4921 scope.go:117] "RemoveContainer" containerID="d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef" Mar 12 14:59:21 crc kubenswrapper[4921]: E0312 14:59:21.877521 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef\": container with ID starting with d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef not found: ID does not exist" containerID="d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.877554 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef"} err="failed to get container status \"d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef\": rpc error: code = NotFound desc = could not find container \"d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef\": container with ID starting with d553093d4ee3b60a07189bcb70e5c65657edbf6322a427e4a47b889a8bee4bef not found: ID does not exist" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.877578 4921 scope.go:117] "RemoveContainer" containerID="fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0" Mar 12 14:59:21 crc kubenswrapper[4921]: E0312 14:59:21.877982 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0\": container with ID starting with fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0 not found: ID does not exist" containerID="fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.878003 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0"} err="failed to get container status \"fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0\": rpc error: code = NotFound desc = could not find container \"fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0\": container with ID 
starting with fa2840bc2749106c117ff24212de1e616767ee1222cb496d0d76943064b126e0 not found: ID does not exist" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.878019 4921 scope.go:117] "RemoveContainer" containerID="a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd" Mar 12 14:59:21 crc kubenswrapper[4921]: E0312 14:59:21.878356 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd\": container with ID starting with a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd not found: ID does not exist" containerID="a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.878387 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd"} err="failed to get container status \"a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd\": rpc error: code = NotFound desc = could not find container \"a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd\": container with ID starting with a1ad351f8d9e5c0f753cbea850e3db4135e8dd7b6194ab34476f9aeac23306fd not found: ID does not exist" Mar 12 14:59:21 crc kubenswrapper[4921]: I0312 14:59:21.994012 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" path="/var/lib/kubelet/pods/d96c7171-483e-4f1c-bc44-97005b79e976/volumes" Mar 12 14:59:23 crc kubenswrapper[4921]: I0312 14:59:23.983184 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:59:23 crc kubenswrapper[4921]: E0312 14:59:23.983685 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:59:38 crc kubenswrapper[4921]: I0312 14:59:38.005777 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:59:38 crc kubenswrapper[4921]: E0312 14:59:38.009113 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 14:59:49 crc kubenswrapper[4921]: I0312 14:59:49.983999 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 14:59:49 crc kubenswrapper[4921]: E0312 14:59:49.984923 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.150017 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555460-ztlpg"] Mar 12 15:00:00 crc kubenswrapper[4921]: E0312 15:00:00.151128 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="extract-content" Mar 12 
15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.151148 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="extract-content" Mar 12 15:00:00 crc kubenswrapper[4921]: E0312 15:00:00.151188 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="registry-server" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.151195 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="registry-server" Mar 12 15:00:00 crc kubenswrapper[4921]: E0312 15:00:00.151214 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="extract-utilities" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.151223 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="extract-utilities" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.151450 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96c7171-483e-4f1c-bc44-97005b79e976" containerName="registry-server" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.152259 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.154973 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.155560 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.155654 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.160588 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm"] Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.161706 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.163607 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.166633 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.171885 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-ztlpg"] Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.201543 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm"] Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.273723 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b5f91db2-4487-472c-a05e-fbe7391d60ff-config-volume\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.273953 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l246v\" (UniqueName: \"kubernetes.io/projected/b5f91db2-4487-472c-a05e-fbe7391d60ff-kube-api-access-l246v\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.274104 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f91db2-4487-472c-a05e-fbe7391d60ff-secret-volume\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.274634 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4vnv\" (UniqueName: \"kubernetes.io/projected/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9-kube-api-access-c4vnv\") pod \"auto-csr-approver-29555460-ztlpg\" (UID: \"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9\") " pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.377108 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f91db2-4487-472c-a05e-fbe7391d60ff-secret-volume\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 
15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.377227 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4vnv\" (UniqueName: \"kubernetes.io/projected/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9-kube-api-access-c4vnv\") pod \"auto-csr-approver-29555460-ztlpg\" (UID: \"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9\") " pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.377287 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f91db2-4487-472c-a05e-fbe7391d60ff-config-volume\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.377342 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l246v\" (UniqueName: \"kubernetes.io/projected/b5f91db2-4487-472c-a05e-fbe7391d60ff-kube-api-access-l246v\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.378317 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f91db2-4487-472c-a05e-fbe7391d60ff-config-volume\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.392430 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f91db2-4487-472c-a05e-fbe7391d60ff-secret-volume\") pod \"collect-profiles-29555460-kzlfm\" (UID: 
\"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.397393 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l246v\" (UniqueName: \"kubernetes.io/projected/b5f91db2-4487-472c-a05e-fbe7391d60ff-kube-api-access-l246v\") pod \"collect-profiles-29555460-kzlfm\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.405149 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4vnv\" (UniqueName: \"kubernetes.io/projected/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9-kube-api-access-c4vnv\") pod \"auto-csr-approver-29555460-ztlpg\" (UID: \"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9\") " pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.475672 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.487261 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:00 crc kubenswrapper[4921]: I0312 15:00:00.952579 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm"] Mar 12 15:00:01 crc kubenswrapper[4921]: I0312 15:00:01.023618 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-ztlpg"] Mar 12 15:00:01 crc kubenswrapper[4921]: W0312 15:00:01.026418 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod987e7d06_17e6_4d6c_af5d_f72d91fbf6d9.slice/crio-d42ff2150bb29c12e5ec5ac0ee53ea47a7101b588f1e2c43342232be5a948035 WatchSource:0}: Error finding container d42ff2150bb29c12e5ec5ac0ee53ea47a7101b588f1e2c43342232be5a948035: Status 404 returned error can't find the container with id d42ff2150bb29c12e5ec5ac0ee53ea47a7101b588f1e2c43342232be5a948035 Mar 12 15:00:01 crc kubenswrapper[4921]: I0312 15:00:01.138348 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" event={"ID":"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9","Type":"ContainerStarted","Data":"d42ff2150bb29c12e5ec5ac0ee53ea47a7101b588f1e2c43342232be5a948035"} Mar 12 15:00:01 crc kubenswrapper[4921]: I0312 15:00:01.140220 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" event={"ID":"b5f91db2-4487-472c-a05e-fbe7391d60ff","Type":"ContainerStarted","Data":"f7e1901b946d5d563acd30a784997c2fa500e70ebfd9644b218e4ac3e286a3f2"} Mar 12 15:00:01 crc kubenswrapper[4921]: I0312 15:00:01.140253 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" 
event={"ID":"b5f91db2-4487-472c-a05e-fbe7391d60ff","Type":"ContainerStarted","Data":"eedd78af64e11ddd8f7b5ac032e4838641c990c02d392da3de6b500d2e78a959"} Mar 12 15:00:01 crc kubenswrapper[4921]: I0312 15:00:01.161500 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" podStartSLOduration=1.1614751700000001 podStartE2EDuration="1.16147517s" podCreationTimestamp="2026-03-12 15:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:00:01.152288727 +0000 UTC m=+6623.842360718" watchObservedRunningTime="2026-03-12 15:00:01.16147517 +0000 UTC m=+6623.851547141" Mar 12 15:00:02 crc kubenswrapper[4921]: I0312 15:00:02.152501 4921 generic.go:334] "Generic (PLEG): container finished" podID="b5f91db2-4487-472c-a05e-fbe7391d60ff" containerID="f7e1901b946d5d563acd30a784997c2fa500e70ebfd9644b218e4ac3e286a3f2" exitCode=0 Mar 12 15:00:02 crc kubenswrapper[4921]: I0312 15:00:02.152568 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" event={"ID":"b5f91db2-4487-472c-a05e-fbe7391d60ff","Type":"ContainerDied","Data":"f7e1901b946d5d563acd30a784997c2fa500e70ebfd9644b218e4ac3e286a3f2"} Mar 12 15:00:02 crc kubenswrapper[4921]: I0312 15:00:02.983607 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:00:02 crc kubenswrapper[4921]: E0312 15:00:02.984217 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.518151 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.542249 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f91db2-4487-472c-a05e-fbe7391d60ff-config-volume\") pod \"b5f91db2-4487-472c-a05e-fbe7391d60ff\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.542335 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l246v\" (UniqueName: \"kubernetes.io/projected/b5f91db2-4487-472c-a05e-fbe7391d60ff-kube-api-access-l246v\") pod \"b5f91db2-4487-472c-a05e-fbe7391d60ff\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.542446 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f91db2-4487-472c-a05e-fbe7391d60ff-secret-volume\") pod \"b5f91db2-4487-472c-a05e-fbe7391d60ff\" (UID: \"b5f91db2-4487-472c-a05e-fbe7391d60ff\") " Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.543176 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5f91db2-4487-472c-a05e-fbe7391d60ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "b5f91db2-4487-472c-a05e-fbe7391d60ff" (UID: "b5f91db2-4487-472c-a05e-fbe7391d60ff"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.548265 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5f91db2-4487-472c-a05e-fbe7391d60ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b5f91db2-4487-472c-a05e-fbe7391d60ff" (UID: "b5f91db2-4487-472c-a05e-fbe7391d60ff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.549027 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5f91db2-4487-472c-a05e-fbe7391d60ff-kube-api-access-l246v" (OuterVolumeSpecName: "kube-api-access-l246v") pod "b5f91db2-4487-472c-a05e-fbe7391d60ff" (UID: "b5f91db2-4487-472c-a05e-fbe7391d60ff"). InnerVolumeSpecName "kube-api-access-l246v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.645403 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b5f91db2-4487-472c-a05e-fbe7391d60ff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.645443 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b5f91db2-4487-472c-a05e-fbe7391d60ff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:03 crc kubenswrapper[4921]: I0312 15:00:03.645454 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l246v\" (UniqueName: \"kubernetes.io/projected/b5f91db2-4487-472c-a05e-fbe7391d60ff-kube-api-access-l246v\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:04 crc kubenswrapper[4921]: I0312 15:00:04.172051 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" 
event={"ID":"b5f91db2-4487-472c-a05e-fbe7391d60ff","Type":"ContainerDied","Data":"eedd78af64e11ddd8f7b5ac032e4838641c990c02d392da3de6b500d2e78a959"} Mar 12 15:00:04 crc kubenswrapper[4921]: I0312 15:00:04.172093 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eedd78af64e11ddd8f7b5ac032e4838641c990c02d392da3de6b500d2e78a959" Mar 12 15:00:04 crc kubenswrapper[4921]: I0312 15:00:04.172111 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm" Mar 12 15:00:04 crc kubenswrapper[4921]: I0312 15:00:04.233772 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns"] Mar 12 15:00:04 crc kubenswrapper[4921]: I0312 15:00:04.249159 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555415-xsmns"] Mar 12 15:00:05 crc kubenswrapper[4921]: I0312 15:00:05.995369 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c314933f-5c81-4c99-872c-e16f3d6317f4" path="/var/lib/kubelet/pods/c314933f-5c81-4c99-872c-e16f3d6317f4/volumes" Mar 12 15:00:13 crc kubenswrapper[4921]: I0312 15:00:13.983125 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:00:13 crc kubenswrapper[4921]: E0312 15:00:13.984215 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:00:20 crc kubenswrapper[4921]: I0312 15:00:20.298273 4921 generic.go:334] "Generic (PLEG): 
container finished" podID="987e7d06-17e6-4d6c-af5d-f72d91fbf6d9" containerID="318092902b5fc55d36a417923ec86bf93491a2e8d475695330b56c4a9b95a0bf" exitCode=0 Mar 12 15:00:20 crc kubenswrapper[4921]: I0312 15:00:20.298480 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" event={"ID":"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9","Type":"ContainerDied","Data":"318092902b5fc55d36a417923ec86bf93491a2e8d475695330b56c4a9b95a0bf"} Mar 12 15:00:21 crc kubenswrapper[4921]: I0312 15:00:21.727164 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:21 crc kubenswrapper[4921]: I0312 15:00:21.785445 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4vnv\" (UniqueName: \"kubernetes.io/projected/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9-kube-api-access-c4vnv\") pod \"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9\" (UID: \"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9\") " Mar 12 15:00:21 crc kubenswrapper[4921]: I0312 15:00:21.792137 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9-kube-api-access-c4vnv" (OuterVolumeSpecName: "kube-api-access-c4vnv") pod "987e7d06-17e6-4d6c-af5d-f72d91fbf6d9" (UID: "987e7d06-17e6-4d6c-af5d-f72d91fbf6d9"). InnerVolumeSpecName "kube-api-access-c4vnv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:00:21 crc kubenswrapper[4921]: I0312 15:00:21.887850 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4vnv\" (UniqueName: \"kubernetes.io/projected/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9-kube-api-access-c4vnv\") on node \"crc\" DevicePath \"\"" Mar 12 15:00:22 crc kubenswrapper[4921]: I0312 15:00:22.540304 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" event={"ID":"987e7d06-17e6-4d6c-af5d-f72d91fbf6d9","Type":"ContainerDied","Data":"d42ff2150bb29c12e5ec5ac0ee53ea47a7101b588f1e2c43342232be5a948035"} Mar 12 15:00:22 crc kubenswrapper[4921]: I0312 15:00:22.540355 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42ff2150bb29c12e5ec5ac0ee53ea47a7101b588f1e2c43342232be5a948035" Mar 12 15:00:22 crc kubenswrapper[4921]: I0312 15:00:22.540416 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555460-ztlpg" Mar 12 15:00:22 crc kubenswrapper[4921]: I0312 15:00:22.807216 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-6v9ll"] Mar 12 15:00:22 crc kubenswrapper[4921]: I0312 15:00:22.817367 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555454-6v9ll"] Mar 12 15:00:23 crc kubenswrapper[4921]: I0312 15:00:23.993377 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08f4757-cd45-491d-a4cd-e0a08adc9fbd" path="/var/lib/kubelet/pods/b08f4757-cd45-491d-a4cd-e0a08adc9fbd/volumes" Mar 12 15:00:24 crc kubenswrapper[4921]: I0312 15:00:24.983648 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:00:24 crc kubenswrapper[4921]: E0312 15:00:24.983984 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:00:37 crc kubenswrapper[4921]: I0312 15:00:37.995583 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:00:37 crc kubenswrapper[4921]: E0312 15:00:37.996859 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:00:49 crc kubenswrapper[4921]: I0312 15:00:49.625723 4921 scope.go:117] "RemoveContainer" containerID="e6f2d1304816860c507b46e8230027c3fa84b5c66cb0f65241c8cd5403d96885" Mar 12 15:00:49 crc kubenswrapper[4921]: I0312 15:00:49.651389 4921 scope.go:117] "RemoveContainer" containerID="2aa0e2c24cede3c575862f8ae39d111cea8f06c750256c5534c2bd1a7974b594" Mar 12 15:00:49 crc kubenswrapper[4921]: I0312 15:00:49.983919 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:00:49 crc kubenswrapper[4921]: E0312 15:00:49.984171 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.158402 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29555461-nscpw"] Mar 12 15:01:00 crc kubenswrapper[4921]: E0312 15:01:00.159534 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987e7d06-17e6-4d6c-af5d-f72d91fbf6d9" containerName="oc" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.159552 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="987e7d06-17e6-4d6c-af5d-f72d91fbf6d9" containerName="oc" Mar 12 15:01:00 crc kubenswrapper[4921]: E0312 15:01:00.159580 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5f91db2-4487-472c-a05e-fbe7391d60ff" containerName="collect-profiles" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.159588 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5f91db2-4487-472c-a05e-fbe7391d60ff" containerName="collect-profiles" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.159883 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5f91db2-4487-472c-a05e-fbe7391d60ff" containerName="collect-profiles" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.159913 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="987e7d06-17e6-4d6c-af5d-f72d91fbf6d9" containerName="oc" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.160723 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.167146 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29555461-nscpw"] Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.327531 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-fernet-keys\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.327611 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lssnm\" (UniqueName: \"kubernetes.io/projected/ce60198f-3189-4ce6-b4a7-32387eb98fa7-kube-api-access-lssnm\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.327662 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-combined-ca-bundle\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.327793 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-config-data\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.430136 4921 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-fernet-keys\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.430189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lssnm\" (UniqueName: \"kubernetes.io/projected/ce60198f-3189-4ce6-b4a7-32387eb98fa7-kube-api-access-lssnm\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.430214 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-combined-ca-bundle\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.430260 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-config-data\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.443249 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-fernet-keys\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.448633 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-combined-ca-bundle\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.452392 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-config-data\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.452995 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lssnm\" (UniqueName: \"kubernetes.io/projected/ce60198f-3189-4ce6-b4a7-32387eb98fa7-kube-api-access-lssnm\") pod \"keystone-cron-29555461-nscpw\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.482573 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:00 crc kubenswrapper[4921]: I0312 15:01:00.949249 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29555461-nscpw"] Mar 12 15:01:01 crc kubenswrapper[4921]: I0312 15:01:01.066619 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555461-nscpw" event={"ID":"ce60198f-3189-4ce6-b4a7-32387eb98fa7","Type":"ContainerStarted","Data":"de17df3667314acd5b5f8447fe4859a64fc980bb2c6493974c5d86f8c2ac1963"} Mar 12 15:01:02 crc kubenswrapper[4921]: I0312 15:01:02.079505 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555461-nscpw" event={"ID":"ce60198f-3189-4ce6-b4a7-32387eb98fa7","Type":"ContainerStarted","Data":"eee3c952d34091685a82a955f928f1f458b4a53f1b54f61062170327c8e2857e"} Mar 12 15:01:02 crc kubenswrapper[4921]: I0312 15:01:02.107807 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29555461-nscpw" podStartSLOduration=2.107782832 podStartE2EDuration="2.107782832s" podCreationTimestamp="2026-03-12 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:01:02.096318137 +0000 UTC m=+6684.786390098" watchObservedRunningTime="2026-03-12 15:01:02.107782832 +0000 UTC m=+6684.797854803" Mar 12 15:01:03 crc kubenswrapper[4921]: I0312 15:01:03.983168 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:01:03 crc kubenswrapper[4921]: E0312 15:01:03.984369 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:01:05 crc kubenswrapper[4921]: I0312 15:01:05.106243 4921 generic.go:334] "Generic (PLEG): container finished" podID="ce60198f-3189-4ce6-b4a7-32387eb98fa7" containerID="eee3c952d34091685a82a955f928f1f458b4a53f1b54f61062170327c8e2857e" exitCode=0 Mar 12 15:01:05 crc kubenswrapper[4921]: I0312 15:01:05.106326 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555461-nscpw" event={"ID":"ce60198f-3189-4ce6-b4a7-32387eb98fa7","Type":"ContainerDied","Data":"eee3c952d34091685a82a955f928f1f458b4a53f1b54f61062170327c8e2857e"} Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.493238 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.660177 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lssnm\" (UniqueName: \"kubernetes.io/projected/ce60198f-3189-4ce6-b4a7-32387eb98fa7-kube-api-access-lssnm\") pod \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.660568 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-config-data\") pod \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.660597 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-fernet-keys\") pod \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.660840 4921 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-combined-ca-bundle\") pod \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\" (UID: \"ce60198f-3189-4ce6-b4a7-32387eb98fa7\") " Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.666658 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce60198f-3189-4ce6-b4a7-32387eb98fa7-kube-api-access-lssnm" (OuterVolumeSpecName: "kube-api-access-lssnm") pod "ce60198f-3189-4ce6-b4a7-32387eb98fa7" (UID: "ce60198f-3189-4ce6-b4a7-32387eb98fa7"). InnerVolumeSpecName "kube-api-access-lssnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.670156 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ce60198f-3189-4ce6-b4a7-32387eb98fa7" (UID: "ce60198f-3189-4ce6-b4a7-32387eb98fa7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.697901 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce60198f-3189-4ce6-b4a7-32387eb98fa7" (UID: "ce60198f-3189-4ce6-b4a7-32387eb98fa7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.713035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-config-data" (OuterVolumeSpecName: "config-data") pod "ce60198f-3189-4ce6-b4a7-32387eb98fa7" (UID: "ce60198f-3189-4ce6-b4a7-32387eb98fa7"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.764226 4921 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.764269 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lssnm\" (UniqueName: \"kubernetes.io/projected/ce60198f-3189-4ce6-b4a7-32387eb98fa7-kube-api-access-lssnm\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.764282 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:06 crc kubenswrapper[4921]: I0312 15:01:06.764292 4921 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce60198f-3189-4ce6-b4a7-32387eb98fa7-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:07 crc kubenswrapper[4921]: I0312 15:01:07.128064 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29555461-nscpw" event={"ID":"ce60198f-3189-4ce6-b4a7-32387eb98fa7","Type":"ContainerDied","Data":"de17df3667314acd5b5f8447fe4859a64fc980bb2c6493974c5d86f8c2ac1963"} Mar 12 15:01:07 crc kubenswrapper[4921]: I0312 15:01:07.128118 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de17df3667314acd5b5f8447fe4859a64fc980bb2c6493974c5d86f8c2ac1963" Mar 12 15:01:07 crc kubenswrapper[4921]: I0312 15:01:07.128140 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29555461-nscpw" Mar 12 15:01:14 crc kubenswrapper[4921]: I0312 15:01:14.983549 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:01:14 crc kubenswrapper[4921]: E0312 15:01:14.984539 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.211079 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bstd5"] Mar 12 15:01:20 crc kubenswrapper[4921]: E0312 15:01:20.211996 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce60198f-3189-4ce6-b4a7-32387eb98fa7" containerName="keystone-cron" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.212009 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce60198f-3189-4ce6-b4a7-32387eb98fa7" containerName="keystone-cron" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.212210 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce60198f-3189-4ce6-b4a7-32387eb98fa7" containerName="keystone-cron" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.213788 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.289307 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bstd5"] Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.359805 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mthvp\" (UniqueName: \"kubernetes.io/projected/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-kube-api-access-mthvp\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.360029 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-utilities\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.360215 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-catalog-content\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.462507 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mthvp\" (UniqueName: \"kubernetes.io/projected/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-kube-api-access-mthvp\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.462674 4921 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-utilities\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.463008 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-catalog-content\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.463381 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-utilities\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.463478 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-catalog-content\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.486044 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mthvp\" (UniqueName: \"kubernetes.io/projected/96baa3f9-7cf9-499b-94ad-0f8cd1a98f76-kube-api-access-mthvp\") pod \"redhat-operators-bstd5\" (UID: \"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76\") " pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:20 crc kubenswrapper[4921]: I0312 15:01:20.534098 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:21 crc kubenswrapper[4921]: I0312 15:01:21.046798 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bstd5"] Mar 12 15:01:21 crc kubenswrapper[4921]: I0312 15:01:21.265082 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bstd5" event={"ID":"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76","Type":"ContainerStarted","Data":"296997162c0478ad99741b71a8a7cb862f1e1cab71e963008ac46919eef2f292"} Mar 12 15:01:21 crc kubenswrapper[4921]: I0312 15:01:21.265135 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bstd5" event={"ID":"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76","Type":"ContainerStarted","Data":"29654d2a420bcfe39825112afe552e2080cce20cfd8377aef60972aba00e726e"} Mar 12 15:01:22 crc kubenswrapper[4921]: I0312 15:01:22.276223 4921 generic.go:334] "Generic (PLEG): container finished" podID="96baa3f9-7cf9-499b-94ad-0f8cd1a98f76" containerID="296997162c0478ad99741b71a8a7cb862f1e1cab71e963008ac46919eef2f292" exitCode=0 Mar 12 15:01:22 crc kubenswrapper[4921]: I0312 15:01:22.276266 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bstd5" event={"ID":"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76","Type":"ContainerDied","Data":"296997162c0478ad99741b71a8a7cb862f1e1cab71e963008ac46919eef2f292"} Mar 12 15:01:25 crc kubenswrapper[4921]: I0312 15:01:25.983805 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:01:25 crc kubenswrapper[4921]: E0312 15:01:25.985503 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:01:30 crc kubenswrapper[4921]: I0312 15:01:30.341807 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bstd5" event={"ID":"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76","Type":"ContainerStarted","Data":"4aadf5f62f76113e180c8afe2258cb7e6d767cc1d75582dbc18d40f2a4729864"} Mar 12 15:01:33 crc kubenswrapper[4921]: I0312 15:01:33.370222 4921 generic.go:334] "Generic (PLEG): container finished" podID="96baa3f9-7cf9-499b-94ad-0f8cd1a98f76" containerID="4aadf5f62f76113e180c8afe2258cb7e6d767cc1d75582dbc18d40f2a4729864" exitCode=0 Mar 12 15:01:33 crc kubenswrapper[4921]: I0312 15:01:33.370383 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bstd5" event={"ID":"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76","Type":"ContainerDied","Data":"4aadf5f62f76113e180c8afe2258cb7e6d767cc1d75582dbc18d40f2a4729864"} Mar 12 15:01:35 crc kubenswrapper[4921]: I0312 15:01:35.389108 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bstd5" event={"ID":"96baa3f9-7cf9-499b-94ad-0f8cd1a98f76","Type":"ContainerStarted","Data":"c22369f301f7348dafc2e3a605e0e4954e83139b0b3f592b871cd45590f56741"} Mar 12 15:01:35 crc kubenswrapper[4921]: I0312 15:01:35.415394 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bstd5" podStartSLOduration=1.9820368990000001 podStartE2EDuration="15.415376316s" podCreationTimestamp="2026-03-12 15:01:20 +0000 UTC" firstStartedPulling="2026-03-12 15:01:21.26677151 +0000 UTC m=+6703.956843481" lastFinishedPulling="2026-03-12 15:01:34.700110937 +0000 UTC m=+6717.390182898" observedRunningTime="2026-03-12 15:01:35.404112128 +0000 UTC m=+6718.094184129" 
watchObservedRunningTime="2026-03-12 15:01:35.415376316 +0000 UTC m=+6718.105448287" Mar 12 15:01:36 crc kubenswrapper[4921]: I0312 15:01:36.983380 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:01:36 crc kubenswrapper[4921]: E0312 15:01:36.983928 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:01:40 crc kubenswrapper[4921]: I0312 15:01:40.535092 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:40 crc kubenswrapper[4921]: I0312 15:01:40.535653 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:40 crc kubenswrapper[4921]: I0312 15:01:40.585928 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:41 crc kubenswrapper[4921]: I0312 15:01:41.537126 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bstd5" Mar 12 15:01:41 crc kubenswrapper[4921]: I0312 15:01:41.610016 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bstd5"] Mar 12 15:01:41 crc kubenswrapper[4921]: I0312 15:01:41.655120 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-26m8d"] Mar 12 15:01:41 crc kubenswrapper[4921]: I0312 15:01:41.655342 4921 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-26m8d" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="registry-server" containerID="cri-o://1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e" gracePeriod=2 Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.206879 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.334989 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-utilities\") pod \"6bf56069-058f-4126-8b69-bd0011f99b1e\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.335083 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4572h\" (UniqueName: \"kubernetes.io/projected/6bf56069-058f-4126-8b69-bd0011f99b1e-kube-api-access-4572h\") pod \"6bf56069-058f-4126-8b69-bd0011f99b1e\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.335151 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-catalog-content\") pod \"6bf56069-058f-4126-8b69-bd0011f99b1e\" (UID: \"6bf56069-058f-4126-8b69-bd0011f99b1e\") " Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.336451 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-utilities" (OuterVolumeSpecName: "utilities") pod "6bf56069-058f-4126-8b69-bd0011f99b1e" (UID: "6bf56069-058f-4126-8b69-bd0011f99b1e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.344920 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf56069-058f-4126-8b69-bd0011f99b1e-kube-api-access-4572h" (OuterVolumeSpecName: "kube-api-access-4572h") pod "6bf56069-058f-4126-8b69-bd0011f99b1e" (UID: "6bf56069-058f-4126-8b69-bd0011f99b1e"). InnerVolumeSpecName "kube-api-access-4572h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.438007 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.438420 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4572h\" (UniqueName: \"kubernetes.io/projected/6bf56069-058f-4126-8b69-bd0011f99b1e-kube-api-access-4572h\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.496288 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bf56069-058f-4126-8b69-bd0011f99b1e" (UID: "6bf56069-058f-4126-8b69-bd0011f99b1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.500460 4921 generic.go:334] "Generic (PLEG): container finished" podID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerID="1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e" exitCode=0 Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.501485 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-26m8d" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.504023 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerDied","Data":"1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e"} Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.504081 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-26m8d" event={"ID":"6bf56069-058f-4126-8b69-bd0011f99b1e","Type":"ContainerDied","Data":"4271ef9bdfd6025d75ccf51173b7b9384deb71a28c6ad392c410c7e84a768ff7"} Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.504100 4921 scope.go:117] "RemoveContainer" containerID="1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.532163 4921 scope.go:117] "RemoveContainer" containerID="a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.540618 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bf56069-058f-4126-8b69-bd0011f99b1e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.549111 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-26m8d"] Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.556818 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-26m8d"] Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.565088 4921 scope.go:117] "RemoveContainer" containerID="4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.616394 4921 scope.go:117] "RemoveContainer" 
containerID="1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e" Mar 12 15:01:42 crc kubenswrapper[4921]: E0312 15:01:42.616896 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e\": container with ID starting with 1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e not found: ID does not exist" containerID="1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.616938 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e"} err="failed to get container status \"1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e\": rpc error: code = NotFound desc = could not find container \"1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e\": container with ID starting with 1b640f394bfa71e82dbf897c645c08051fb35472263979faada9c537255cf54e not found: ID does not exist" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.616963 4921 scope.go:117] "RemoveContainer" containerID="a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4" Mar 12 15:01:42 crc kubenswrapper[4921]: E0312 15:01:42.617402 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4\": container with ID starting with a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4 not found: ID does not exist" containerID="a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.617450 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4"} err="failed to get container status \"a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4\": rpc error: code = NotFound desc = could not find container \"a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4\": container with ID starting with a5eeba53cc490acd0e40784a12ac754f146452ea0f69291e30323cfdbbee05d4 not found: ID does not exist" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.617478 4921 scope.go:117] "RemoveContainer" containerID="4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2" Mar 12 15:01:42 crc kubenswrapper[4921]: E0312 15:01:42.617859 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2\": container with ID starting with 4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2 not found: ID does not exist" containerID="4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2" Mar 12 15:01:42 crc kubenswrapper[4921]: I0312 15:01:42.617908 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2"} err="failed to get container status \"4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2\": rpc error: code = NotFound desc = could not find container \"4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2\": container with ID starting with 4af06eb132b9e7e75b9b8726105028878e416c7463862fe889d5a924edb0e1e2 not found: ID does not exist" Mar 12 15:01:43 crc kubenswrapper[4921]: I0312 15:01:43.992685 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" path="/var/lib/kubelet/pods/6bf56069-058f-4126-8b69-bd0011f99b1e/volumes" Mar 12 15:01:47 crc kubenswrapper[4921]: I0312 
15:01:47.992681 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:01:47 crc kubenswrapper[4921]: E0312 15:01:47.993163 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.146174 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555462-l8fmb"] Mar 12 15:02:00 crc kubenswrapper[4921]: E0312 15:02:00.147314 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="extract-utilities" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.147330 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="extract-utilities" Mar 12 15:02:00 crc kubenswrapper[4921]: E0312 15:02:00.147361 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="extract-content" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.147367 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="extract-content" Mar 12 15:02:00 crc kubenswrapper[4921]: E0312 15:02:00.147379 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="registry-server" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.147385 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="registry-server" Mar 12 
15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.147617 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf56069-058f-4126-8b69-bd0011f99b1e" containerName="registry-server" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.148464 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.150758 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.150927 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.151040 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.157119 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-l8fmb"] Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.210633 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8rhp\" (UniqueName: \"kubernetes.io/projected/7117519c-536f-4300-93d1-97558b45b44a-kube-api-access-r8rhp\") pod \"auto-csr-approver-29555462-l8fmb\" (UID: \"7117519c-536f-4300-93d1-97558b45b44a\") " pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.312893 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8rhp\" (UniqueName: \"kubernetes.io/projected/7117519c-536f-4300-93d1-97558b45b44a-kube-api-access-r8rhp\") pod \"auto-csr-approver-29555462-l8fmb\" (UID: \"7117519c-536f-4300-93d1-97558b45b44a\") " pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 
15:02:00.334748 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8rhp\" (UniqueName: \"kubernetes.io/projected/7117519c-536f-4300-93d1-97558b45b44a-kube-api-access-r8rhp\") pod \"auto-csr-approver-29555462-l8fmb\" (UID: \"7117519c-536f-4300-93d1-97558b45b44a\") " pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.476247 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:00 crc kubenswrapper[4921]: I0312 15:02:00.953242 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-l8fmb"] Mar 12 15:02:01 crc kubenswrapper[4921]: I0312 15:02:01.660117 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" event={"ID":"7117519c-536f-4300-93d1-97558b45b44a","Type":"ContainerStarted","Data":"3e335512f409bc0e29e3a661baa666f6bf49b4b6d4e0f1239e54cc6f524a842b"} Mar 12 15:02:02 crc kubenswrapper[4921]: I0312 15:02:02.984565 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:02:02 crc kubenswrapper[4921]: E0312 15:02:02.985087 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:02:03 crc kubenswrapper[4921]: I0312 15:02:03.675516 4921 generic.go:334] "Generic (PLEG): container finished" podID="7117519c-536f-4300-93d1-97558b45b44a" containerID="2df032571b1b678560c12b7aa910191593c0675a47c49cbbe1d74e6bf2f63dc6" exitCode=0 Mar 12 15:02:03 
crc kubenswrapper[4921]: I0312 15:02:03.675565 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" event={"ID":"7117519c-536f-4300-93d1-97558b45b44a","Type":"ContainerDied","Data":"2df032571b1b678560c12b7aa910191593c0675a47c49cbbe1d74e6bf2f63dc6"} Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.067625 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.221337 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8rhp\" (UniqueName: \"kubernetes.io/projected/7117519c-536f-4300-93d1-97558b45b44a-kube-api-access-r8rhp\") pod \"7117519c-536f-4300-93d1-97558b45b44a\" (UID: \"7117519c-536f-4300-93d1-97558b45b44a\") " Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.227068 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7117519c-536f-4300-93d1-97558b45b44a-kube-api-access-r8rhp" (OuterVolumeSpecName: "kube-api-access-r8rhp") pod "7117519c-536f-4300-93d1-97558b45b44a" (UID: "7117519c-536f-4300-93d1-97558b45b44a"). InnerVolumeSpecName "kube-api-access-r8rhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.324137 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8rhp\" (UniqueName: \"kubernetes.io/projected/7117519c-536f-4300-93d1-97558b45b44a-kube-api-access-r8rhp\") on node \"crc\" DevicePath \"\"" Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.692371 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" event={"ID":"7117519c-536f-4300-93d1-97558b45b44a","Type":"ContainerDied","Data":"3e335512f409bc0e29e3a661baa666f6bf49b4b6d4e0f1239e54cc6f524a842b"} Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.692414 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e335512f409bc0e29e3a661baa666f6bf49b4b6d4e0f1239e54cc6f524a842b" Mar 12 15:02:05 crc kubenswrapper[4921]: I0312 15:02:05.692429 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555462-l8fmb" Mar 12 15:02:06 crc kubenswrapper[4921]: I0312 15:02:06.144266 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-f6hzz"] Mar 12 15:02:06 crc kubenswrapper[4921]: I0312 15:02:06.153400 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555456-f6hzz"] Mar 12 15:02:07 crc kubenswrapper[4921]: I0312 15:02:07.993344 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68143e2b-5e11-40f2-8771-fde5a32f2188" path="/var/lib/kubelet/pods/68143e2b-5e11-40f2-8771-fde5a32f2188/volumes" Mar 12 15:02:14 crc kubenswrapper[4921]: I0312 15:02:14.982979 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:02:14 crc kubenswrapper[4921]: E0312 15:02:14.983727 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:02:27 crc kubenswrapper[4921]: I0312 15:02:27.996609 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:02:28 crc kubenswrapper[4921]: I0312 15:02:28.918850 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6"} Mar 12 15:02:49 crc kubenswrapper[4921]: I0312 15:02:49.801761 4921 scope.go:117] "RemoveContainer" containerID="67ac367f6c7f66d3c84623e180dce28ce62234c2d6d40b8f2be1adae461ab5b4" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.147424 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555464-cdx7n"] Mar 12 15:04:00 crc kubenswrapper[4921]: E0312 15:04:00.150209 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7117519c-536f-4300-93d1-97558b45b44a" containerName="oc" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.150242 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7117519c-536f-4300-93d1-97558b45b44a" containerName="oc" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.151297 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7117519c-536f-4300-93d1-97558b45b44a" containerName="oc" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.152663 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.158518 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.158626 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.158519 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.168589 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-cdx7n"] Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.178016 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmh8\" (UniqueName: \"kubernetes.io/projected/afb767b0-e911-44cb-b21a-25084bdd54aa-kube-api-access-zdmh8\") pod \"auto-csr-approver-29555464-cdx7n\" (UID: \"afb767b0-e911-44cb-b21a-25084bdd54aa\") " pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.280425 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmh8\" (UniqueName: \"kubernetes.io/projected/afb767b0-e911-44cb-b21a-25084bdd54aa-kube-api-access-zdmh8\") pod \"auto-csr-approver-29555464-cdx7n\" (UID: \"afb767b0-e911-44cb-b21a-25084bdd54aa\") " pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.307556 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmh8\" (UniqueName: \"kubernetes.io/projected/afb767b0-e911-44cb-b21a-25084bdd54aa-kube-api-access-zdmh8\") pod \"auto-csr-approver-29555464-cdx7n\" (UID: \"afb767b0-e911-44cb-b21a-25084bdd54aa\") " 
pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.492573 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.962722 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-cdx7n"] Mar 12 15:04:00 crc kubenswrapper[4921]: I0312 15:04:00.973886 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:04:01 crc kubenswrapper[4921]: I0312 15:04:01.730566 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" event={"ID":"afb767b0-e911-44cb-b21a-25084bdd54aa","Type":"ContainerStarted","Data":"57762c96a6b0be38d257518ed642f4b795cf71fdf3ac9bde4298189e58a526a6"} Mar 12 15:04:02 crc kubenswrapper[4921]: I0312 15:04:02.743277 4921 generic.go:334] "Generic (PLEG): container finished" podID="afb767b0-e911-44cb-b21a-25084bdd54aa" containerID="2aaacaf4d2aa64d83ce434101909e21b32af3a8b8bc1c50608480fb0a0df1fb9" exitCode=0 Mar 12 15:04:02 crc kubenswrapper[4921]: I0312 15:04:02.743335 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" event={"ID":"afb767b0-e911-44cb-b21a-25084bdd54aa","Type":"ContainerDied","Data":"2aaacaf4d2aa64d83ce434101909e21b32af3a8b8bc1c50608480fb0a0df1fb9"} Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.148540 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.260538 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdmh8\" (UniqueName: \"kubernetes.io/projected/afb767b0-e911-44cb-b21a-25084bdd54aa-kube-api-access-zdmh8\") pod \"afb767b0-e911-44cb-b21a-25084bdd54aa\" (UID: \"afb767b0-e911-44cb-b21a-25084bdd54aa\") " Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.270394 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb767b0-e911-44cb-b21a-25084bdd54aa-kube-api-access-zdmh8" (OuterVolumeSpecName: "kube-api-access-zdmh8") pod "afb767b0-e911-44cb-b21a-25084bdd54aa" (UID: "afb767b0-e911-44cb-b21a-25084bdd54aa"). InnerVolumeSpecName "kube-api-access-zdmh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.364251 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdmh8\" (UniqueName: \"kubernetes.io/projected/afb767b0-e911-44cb-b21a-25084bdd54aa-kube-api-access-zdmh8\") on node \"crc\" DevicePath \"\"" Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.760204 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" event={"ID":"afb767b0-e911-44cb-b21a-25084bdd54aa","Type":"ContainerDied","Data":"57762c96a6b0be38d257518ed642f4b795cf71fdf3ac9bde4298189e58a526a6"} Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.760241 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57762c96a6b0be38d257518ed642f4b795cf71fdf3ac9bde4298189e58a526a6" Mar 12 15:04:04 crc kubenswrapper[4921]: I0312 15:04:04.760275 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555464-cdx7n" Mar 12 15:04:05 crc kubenswrapper[4921]: I0312 15:04:05.215456 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-qd8sq"] Mar 12 15:04:05 crc kubenswrapper[4921]: I0312 15:04:05.223360 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555458-qd8sq"] Mar 12 15:04:05 crc kubenswrapper[4921]: I0312 15:04:05.994524 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f75c568-d4f3-4787-b73d-e08ed0712f97" path="/var/lib/kubelet/pods/1f75c568-d4f3-4787-b73d-e08ed0712f97/volumes" Mar 12 15:04:49 crc kubenswrapper[4921]: I0312 15:04:49.904284 4921 scope.go:117] "RemoveContainer" containerID="0e014f11b7f8403e227ceebe40e7dd4ea56b6624d53e75bb7ee0540ee3074e88" Mar 12 15:04:56 crc kubenswrapper[4921]: I0312 15:04:56.323922 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:04:56 crc kubenswrapper[4921]: I0312 15:04:56.324538 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:05:26 crc kubenswrapper[4921]: I0312 15:05:26.323890 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:05:26 crc kubenswrapper[4921]: 
I0312 15:05:26.324622 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:05:56 crc kubenswrapper[4921]: I0312 15:05:56.323730 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:05:56 crc kubenswrapper[4921]: I0312 15:05:56.324648 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:05:56 crc kubenswrapper[4921]: I0312 15:05:56.324777 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:05:56 crc kubenswrapper[4921]: I0312 15:05:56.326280 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:05:56 crc kubenswrapper[4921]: I0312 15:05:56.326477 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" 
containerName="machine-config-daemon" containerID="cri-o://d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6" gracePeriod=600 Mar 12 15:05:56 crc kubenswrapper[4921]: E0312 15:05:56.562773 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae82cb49_657a_4b47_8107_0729b9edf47b.slice/crio-d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae82cb49_657a_4b47_8107_0729b9edf47b.slice/crio-conmon-d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6.scope\": RecentStats: unable to find data in memory cache]" Mar 12 15:05:57 crc kubenswrapper[4921]: I0312 15:05:57.355415 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6" exitCode=0 Mar 12 15:05:57 crc kubenswrapper[4921]: I0312 15:05:57.355865 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6"} Mar 12 15:05:57 crc kubenswrapper[4921]: I0312 15:05:57.355906 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278"} Mar 12 15:05:57 crc kubenswrapper[4921]: I0312 15:05:57.355935 4921 scope.go:117] "RemoveContainer" containerID="98ef2c4d44082bf318efc44a80342b6d55b893941ffc7b1fb149d1affa6e096e" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.144330 4921 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555466-96cfs"] Mar 12 15:06:00 crc kubenswrapper[4921]: E0312 15:06:00.145331 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb767b0-e911-44cb-b21a-25084bdd54aa" containerName="oc" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.145345 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb767b0-e911-44cb-b21a-25084bdd54aa" containerName="oc" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.145536 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb767b0-e911-44cb-b21a-25084bdd54aa" containerName="oc" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.146225 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.147823 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.148128 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.148419 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.154802 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-96cfs"] Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.293824 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4br4\" (UniqueName: \"kubernetes.io/projected/e0e6cb18-b09a-47c0-bc10-eeee61328e31-kube-api-access-f4br4\") pod \"auto-csr-approver-29555466-96cfs\" (UID: \"e0e6cb18-b09a-47c0-bc10-eeee61328e31\") " pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:00 
crc kubenswrapper[4921]: I0312 15:06:00.397203 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4br4\" (UniqueName: \"kubernetes.io/projected/e0e6cb18-b09a-47c0-bc10-eeee61328e31-kube-api-access-f4br4\") pod \"auto-csr-approver-29555466-96cfs\" (UID: \"e0e6cb18-b09a-47c0-bc10-eeee61328e31\") " pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.415407 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4br4\" (UniqueName: \"kubernetes.io/projected/e0e6cb18-b09a-47c0-bc10-eeee61328e31-kube-api-access-f4br4\") pod \"auto-csr-approver-29555466-96cfs\" (UID: \"e0e6cb18-b09a-47c0-bc10-eeee61328e31\") " pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.470385 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:00 crc kubenswrapper[4921]: I0312 15:06:00.901346 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-96cfs"] Mar 12 15:06:00 crc kubenswrapper[4921]: W0312 15:06:00.906270 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0e6cb18_b09a_47c0_bc10_eeee61328e31.slice/crio-bc1039d64376d8b6f174948f771c7dcba8f052b83ee3a1e600050bfdaaa407c0 WatchSource:0}: Error finding container bc1039d64376d8b6f174948f771c7dcba8f052b83ee3a1e600050bfdaaa407c0: Status 404 returned error can't find the container with id bc1039d64376d8b6f174948f771c7dcba8f052b83ee3a1e600050bfdaaa407c0 Mar 12 15:06:01 crc kubenswrapper[4921]: I0312 15:06:01.403717 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-96cfs" 
event={"ID":"e0e6cb18-b09a-47c0-bc10-eeee61328e31","Type":"ContainerStarted","Data":"bc1039d64376d8b6f174948f771c7dcba8f052b83ee3a1e600050bfdaaa407c0"} Mar 12 15:06:02 crc kubenswrapper[4921]: I0312 15:06:02.413977 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-96cfs" event={"ID":"e0e6cb18-b09a-47c0-bc10-eeee61328e31","Type":"ContainerStarted","Data":"e193d817f438a9c411f85d114e72b737210615639eb27c5b7ad1db5202e501db"} Mar 12 15:06:02 crc kubenswrapper[4921]: I0312 15:06:02.430845 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555466-96cfs" podStartSLOduration=1.390259217 podStartE2EDuration="2.430801576s" podCreationTimestamp="2026-03-12 15:06:00 +0000 UTC" firstStartedPulling="2026-03-12 15:06:00.908934631 +0000 UTC m=+6983.599006602" lastFinishedPulling="2026-03-12 15:06:01.94947695 +0000 UTC m=+6984.639548961" observedRunningTime="2026-03-12 15:06:02.426108251 +0000 UTC m=+6985.116180222" watchObservedRunningTime="2026-03-12 15:06:02.430801576 +0000 UTC m=+6985.120873547" Mar 12 15:06:03 crc kubenswrapper[4921]: I0312 15:06:03.429647 4921 generic.go:334] "Generic (PLEG): container finished" podID="e0e6cb18-b09a-47c0-bc10-eeee61328e31" containerID="e193d817f438a9c411f85d114e72b737210615639eb27c5b7ad1db5202e501db" exitCode=0 Mar 12 15:06:03 crc kubenswrapper[4921]: I0312 15:06:03.429880 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-96cfs" event={"ID":"e0e6cb18-b09a-47c0-bc10-eeee61328e31","Type":"ContainerDied","Data":"e193d817f438a9c411f85d114e72b737210615639eb27c5b7ad1db5202e501db"} Mar 12 15:06:04 crc kubenswrapper[4921]: I0312 15:06:04.880488 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:04 crc kubenswrapper[4921]: I0312 15:06:04.995660 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4br4\" (UniqueName: \"kubernetes.io/projected/e0e6cb18-b09a-47c0-bc10-eeee61328e31-kube-api-access-f4br4\") pod \"e0e6cb18-b09a-47c0-bc10-eeee61328e31\" (UID: \"e0e6cb18-b09a-47c0-bc10-eeee61328e31\") " Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.001436 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e6cb18-b09a-47c0-bc10-eeee61328e31-kube-api-access-f4br4" (OuterVolumeSpecName: "kube-api-access-f4br4") pod "e0e6cb18-b09a-47c0-bc10-eeee61328e31" (UID: "e0e6cb18-b09a-47c0-bc10-eeee61328e31"). InnerVolumeSpecName "kube-api-access-f4br4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.098034 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4br4\" (UniqueName: \"kubernetes.io/projected/e0e6cb18-b09a-47c0-bc10-eeee61328e31-kube-api-access-f4br4\") on node \"crc\" DevicePath \"\"" Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.450628 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555466-96cfs" event={"ID":"e0e6cb18-b09a-47c0-bc10-eeee61328e31","Type":"ContainerDied","Data":"bc1039d64376d8b6f174948f771c7dcba8f052b83ee3a1e600050bfdaaa407c0"} Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.450679 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1039d64376d8b6f174948f771c7dcba8f052b83ee3a1e600050bfdaaa407c0" Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.450736 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555466-96cfs" Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.502338 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-ztlpg"] Mar 12 15:06:05 crc kubenswrapper[4921]: I0312 15:06:05.511476 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555460-ztlpg"] Mar 12 15:06:06 crc kubenswrapper[4921]: I0312 15:06:06.002864 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987e7d06-17e6-4d6c-af5d-f72d91fbf6d9" path="/var/lib/kubelet/pods/987e7d06-17e6-4d6c-af5d-f72d91fbf6d9/volumes" Mar 12 15:06:50 crc kubenswrapper[4921]: I0312 15:06:50.009491 4921 scope.go:117] "RemoveContainer" containerID="318092902b5fc55d36a417923ec86bf93491a2e8d475695330b56c4a9b95a0bf" Mar 12 15:07:20 crc kubenswrapper[4921]: I0312 15:07:20.931005 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="ab9571cc-4c2d-4462-adc5-f84bd590bcca" containerName="galera" probeResult="failure" output="command timed out" Mar 12 15:07:56 crc kubenswrapper[4921]: I0312 15:07:56.323735 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:07:56 crc kubenswrapper[4921]: I0312 15:07:56.324308 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.164009 4921 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29555468-9j4ts"] Mar 12 15:08:00 crc kubenswrapper[4921]: E0312 15:08:00.164982 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e6cb18-b09a-47c0-bc10-eeee61328e31" containerName="oc" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.164995 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e6cb18-b09a-47c0-bc10-eeee61328e31" containerName="oc" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.165200 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e6cb18-b09a-47c0-bc10-eeee61328e31" containerName="oc" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.165907 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.168326 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.169359 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.169476 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.171493 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-9j4ts"] Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.323354 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zntp\" (UniqueName: \"kubernetes.io/projected/4d92c0a0-5c45-4181-bf72-187763dfba56-kube-api-access-9zntp\") pod \"auto-csr-approver-29555468-9j4ts\" (UID: \"4d92c0a0-5c45-4181-bf72-187763dfba56\") " pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 
15:08:00.425640 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zntp\" (UniqueName: \"kubernetes.io/projected/4d92c0a0-5c45-4181-bf72-187763dfba56-kube-api-access-9zntp\") pod \"auto-csr-approver-29555468-9j4ts\" (UID: \"4d92c0a0-5c45-4181-bf72-187763dfba56\") " pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.449744 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zntp\" (UniqueName: \"kubernetes.io/projected/4d92c0a0-5c45-4181-bf72-187763dfba56-kube-api-access-9zntp\") pod \"auto-csr-approver-29555468-9j4ts\" (UID: \"4d92c0a0-5c45-4181-bf72-187763dfba56\") " pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.488297 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:00 crc kubenswrapper[4921]: I0312 15:08:00.979745 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-9j4ts"] Mar 12 15:08:01 crc kubenswrapper[4921]: I0312 15:08:01.326383 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" event={"ID":"4d92c0a0-5c45-4181-bf72-187763dfba56","Type":"ContainerStarted","Data":"fa3bf828b94b9a131cd107c960972634762e3acc6dc4db4e5781fdea88058f01"} Mar 12 15:08:03 crc kubenswrapper[4921]: I0312 15:08:03.355502 4921 generic.go:334] "Generic (PLEG): container finished" podID="4d92c0a0-5c45-4181-bf72-187763dfba56" containerID="d1b56603c2eee71b7d18bce67c49be5d4eaeaebc36f727555025f5f8179d4787" exitCode=0 Mar 12 15:08:03 crc kubenswrapper[4921]: I0312 15:08:03.355558 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" 
event={"ID":"4d92c0a0-5c45-4181-bf72-187763dfba56","Type":"ContainerDied","Data":"d1b56603c2eee71b7d18bce67c49be5d4eaeaebc36f727555025f5f8179d4787"} Mar 12 15:08:04 crc kubenswrapper[4921]: I0312 15:08:04.826642 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:04 crc kubenswrapper[4921]: I0312 15:08:04.910675 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zntp\" (UniqueName: \"kubernetes.io/projected/4d92c0a0-5c45-4181-bf72-187763dfba56-kube-api-access-9zntp\") pod \"4d92c0a0-5c45-4181-bf72-187763dfba56\" (UID: \"4d92c0a0-5c45-4181-bf72-187763dfba56\") " Mar 12 15:08:04 crc kubenswrapper[4921]: I0312 15:08:04.917473 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d92c0a0-5c45-4181-bf72-187763dfba56-kube-api-access-9zntp" (OuterVolumeSpecName: "kube-api-access-9zntp") pod "4d92c0a0-5c45-4181-bf72-187763dfba56" (UID: "4d92c0a0-5c45-4181-bf72-187763dfba56"). InnerVolumeSpecName "kube-api-access-9zntp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.013275 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zntp\" (UniqueName: \"kubernetes.io/projected/4d92c0a0-5c45-4181-bf72-187763dfba56-kube-api-access-9zntp\") on node \"crc\" DevicePath \"\"" Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.377741 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" event={"ID":"4d92c0a0-5c45-4181-bf72-187763dfba56","Type":"ContainerDied","Data":"fa3bf828b94b9a131cd107c960972634762e3acc6dc4db4e5781fdea88058f01"} Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.377775 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa3bf828b94b9a131cd107c960972634762e3acc6dc4db4e5781fdea88058f01" Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.377797 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555468-9j4ts" Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.906984 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-l8fmb"] Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.917027 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555462-l8fmb"] Mar 12 15:08:05 crc kubenswrapper[4921]: I0312 15:08:05.995283 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7117519c-536f-4300-93d1-97558b45b44a" path="/var/lib/kubelet/pods/7117519c-536f-4300-93d1-97558b45b44a/volumes" Mar 12 15:08:26 crc kubenswrapper[4921]: I0312 15:08:26.324156 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 12 15:08:26 crc kubenswrapper[4921]: I0312 15:08:26.326121 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:08:50 crc kubenswrapper[4921]: I0312 15:08:50.254867 4921 scope.go:117] "RemoveContainer" containerID="2df032571b1b678560c12b7aa910191593c0675a47c49cbbe1d74e6bf2f63dc6" Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.323659 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.324215 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.324258 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.325083 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.325132 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" gracePeriod=600 Mar 12 15:08:56 crc kubenswrapper[4921]: E0312 15:08:56.446664 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.819683 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" exitCode=0 Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.819741 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278"} Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.819779 4921 scope.go:117] "RemoveContainer" containerID="d008fe55876ebcaed42b479e889b95046fb05d79bcac948a95c481e8b17222c6" Mar 12 15:08:56 crc kubenswrapper[4921]: I0312 15:08:56.820558 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:08:56 crc kubenswrapper[4921]: E0312 15:08:56.820961 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.297223 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8ztxh"] Mar 12 15:08:59 crc kubenswrapper[4921]: E0312 15:08:59.298976 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d92c0a0-5c45-4181-bf72-187763dfba56" containerName="oc" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.299068 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d92c0a0-5c45-4181-bf72-187763dfba56" containerName="oc" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.299375 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d92c0a0-5c45-4181-bf72-187763dfba56" containerName="oc" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.300876 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.312267 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ztxh"] Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.394452 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-utilities\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.394652 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-catalog-content\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.394691 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzgt\" (UniqueName: \"kubernetes.io/projected/846d0bfc-e7b5-457e-be3f-da9276f60821-kube-api-access-tpzgt\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.496338 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzgt\" (UniqueName: \"kubernetes.io/projected/846d0bfc-e7b5-457e-be3f-da9276f60821-kube-api-access-tpzgt\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.496460 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-utilities\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.496634 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-catalog-content\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.497086 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-utilities\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.497153 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-catalog-content\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.526762 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzgt\" (UniqueName: \"kubernetes.io/projected/846d0bfc-e7b5-457e-be3f-da9276f60821-kube-api-access-tpzgt\") pod \"certified-operators-8ztxh\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:08:59 crc kubenswrapper[4921]: I0312 15:08:59.623594 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:00 crc kubenswrapper[4921]: I0312 15:09:00.141392 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ztxh"] Mar 12 15:09:00 crc kubenswrapper[4921]: I0312 15:09:00.868034 4921 generic.go:334] "Generic (PLEG): container finished" podID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerID="fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227" exitCode=0 Mar 12 15:09:00 crc kubenswrapper[4921]: I0312 15:09:00.868295 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ztxh" event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerDied","Data":"fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227"} Mar 12 15:09:00 crc kubenswrapper[4921]: I0312 15:09:00.868318 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ztxh" event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerStarted","Data":"bc8b55d4794aeb1f3f10de93a1b1798f05a8dfbcee9a9524b225efc1a689cbdb"} Mar 12 15:09:01 crc kubenswrapper[4921]: I0312 15:09:01.879673 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ztxh" event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerStarted","Data":"c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af"} Mar 12 15:09:02 crc kubenswrapper[4921]: I0312 15:09:02.888436 4921 generic.go:334] "Generic (PLEG): container finished" podID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerID="c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af" exitCode=0 Mar 12 15:09:02 crc kubenswrapper[4921]: I0312 15:09:02.888508 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ztxh" 
event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerDied","Data":"c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af"} Mar 12 15:09:02 crc kubenswrapper[4921]: I0312 15:09:02.891091 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:09:03 crc kubenswrapper[4921]: I0312 15:09:03.897665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ztxh" event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerStarted","Data":"4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41"} Mar 12 15:09:03 crc kubenswrapper[4921]: I0312 15:09:03.914832 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8ztxh" podStartSLOduration=2.431459572 podStartE2EDuration="4.914792369s" podCreationTimestamp="2026-03-12 15:08:59 +0000 UTC" firstStartedPulling="2026-03-12 15:09:00.870424501 +0000 UTC m=+7163.560496472" lastFinishedPulling="2026-03-12 15:09:03.353757298 +0000 UTC m=+7166.043829269" observedRunningTime="2026-03-12 15:09:03.913953193 +0000 UTC m=+7166.604025164" watchObservedRunningTime="2026-03-12 15:09:03.914792369 +0000 UTC m=+7166.604864340" Mar 12 15:09:07 crc kubenswrapper[4921]: I0312 15:09:07.991120 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:09:07 crc kubenswrapper[4921]: E0312 15:09:07.992186 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:09:09 crc kubenswrapper[4921]: I0312 
15:09:09.625772 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:09 crc kubenswrapper[4921]: I0312 15:09:09.626262 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:09 crc kubenswrapper[4921]: I0312 15:09:09.680709 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:10 crc kubenswrapper[4921]: I0312 15:09:10.001300 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:10 crc kubenswrapper[4921]: I0312 15:09:10.066306 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ztxh"] Mar 12 15:09:11 crc kubenswrapper[4921]: I0312 15:09:11.975469 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8ztxh" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="registry-server" containerID="cri-o://4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41" gracePeriod=2 Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.512579 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.667594 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-utilities\") pod \"846d0bfc-e7b5-457e-be3f-da9276f60821\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.667713 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-catalog-content\") pod \"846d0bfc-e7b5-457e-be3f-da9276f60821\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.668011 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpzgt\" (UniqueName: \"kubernetes.io/projected/846d0bfc-e7b5-457e-be3f-da9276f60821-kube-api-access-tpzgt\") pod \"846d0bfc-e7b5-457e-be3f-da9276f60821\" (UID: \"846d0bfc-e7b5-457e-be3f-da9276f60821\") " Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.668547 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-utilities" (OuterVolumeSpecName: "utilities") pod "846d0bfc-e7b5-457e-be3f-da9276f60821" (UID: "846d0bfc-e7b5-457e-be3f-da9276f60821"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.675990 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/846d0bfc-e7b5-457e-be3f-da9276f60821-kube-api-access-tpzgt" (OuterVolumeSpecName: "kube-api-access-tpzgt") pod "846d0bfc-e7b5-457e-be3f-da9276f60821" (UID: "846d0bfc-e7b5-457e-be3f-da9276f60821"). InnerVolumeSpecName "kube-api-access-tpzgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.771113 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.771151 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpzgt\" (UniqueName: \"kubernetes.io/projected/846d0bfc-e7b5-457e-be3f-da9276f60821-kube-api-access-tpzgt\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.868311 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "846d0bfc-e7b5-457e-be3f-da9276f60821" (UID: "846d0bfc-e7b5-457e-be3f-da9276f60821"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.873454 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/846d0bfc-e7b5-457e-be3f-da9276f60821-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.985139 4921 generic.go:334] "Generic (PLEG): container finished" podID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerID="4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41" exitCode=0 Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.985174 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ztxh" event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerDied","Data":"4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41"} Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.985194 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-8ztxh" event={"ID":"846d0bfc-e7b5-457e-be3f-da9276f60821","Type":"ContainerDied","Data":"bc8b55d4794aeb1f3f10de93a1b1798f05a8dfbcee9a9524b225efc1a689cbdb"} Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.985210 4921 scope.go:117] "RemoveContainer" containerID="4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41" Mar 12 15:09:12 crc kubenswrapper[4921]: I0312 15:09:12.985217 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ztxh" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.017861 4921 scope.go:117] "RemoveContainer" containerID="c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.027461 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ztxh"] Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.036278 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8ztxh"] Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.050129 4921 scope.go:117] "RemoveContainer" containerID="fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.097502 4921 scope.go:117] "RemoveContainer" containerID="4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41" Mar 12 15:09:13 crc kubenswrapper[4921]: E0312 15:09:13.098057 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41\": container with ID starting with 4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41 not found: ID does not exist" containerID="4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 
15:09:13.098157 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41"} err="failed to get container status \"4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41\": rpc error: code = NotFound desc = could not find container \"4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41\": container with ID starting with 4c9f86062531ea306df8f9f6250be1815dedeb78c6a8ccb1dcb3ea3bc62cbc41 not found: ID does not exist" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.098196 4921 scope.go:117] "RemoveContainer" containerID="c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af" Mar 12 15:09:13 crc kubenswrapper[4921]: E0312 15:09:13.098611 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af\": container with ID starting with c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af not found: ID does not exist" containerID="c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.098666 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af"} err="failed to get container status \"c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af\": rpc error: code = NotFound desc = could not find container \"c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af\": container with ID starting with c5cd8a97701fe067ca89244079eaf42513e6cbc1d8eb54ff5e0cf407c49752af not found: ID does not exist" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.098682 4921 scope.go:117] "RemoveContainer" containerID="fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227" Mar 12 15:09:13 crc 
kubenswrapper[4921]: E0312 15:09:13.099190 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227\": container with ID starting with fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227 not found: ID does not exist" containerID="fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227" Mar 12 15:09:13 crc kubenswrapper[4921]: I0312 15:09:13.099210 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227"} err="failed to get container status \"fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227\": rpc error: code = NotFound desc = could not find container \"fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227\": container with ID starting with fd9f1afb4fbfb0c34bcc614d846856be2b5c7662120ab6bf54cdf167ee22a227 not found: ID does not exist" Mar 12 15:09:14 crc kubenswrapper[4921]: I0312 15:09:14.001342 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" path="/var/lib/kubelet/pods/846d0bfc-e7b5-457e-be3f-da9276f60821/volumes" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.651807 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-28smp"] Mar 12 15:09:17 crc kubenswrapper[4921]: E0312 15:09:17.652703 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="extract-utilities" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.652716 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="extract-utilities" Mar 12 15:09:17 crc kubenswrapper[4921]: E0312 15:09:17.652735 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="extract-content" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.652742 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="extract-content" Mar 12 15:09:17 crc kubenswrapper[4921]: E0312 15:09:17.652769 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="registry-server" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.652775 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="registry-server" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.652956 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="846d0bfc-e7b5-457e-be3f-da9276f60821" containerName="registry-server" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.654497 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.661770 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28smp"] Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.776122 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-utilities\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.776161 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wkp\" (UniqueName: \"kubernetes.io/projected/7ad37800-a8a2-4dc1-8537-69c29bbb8106-kube-api-access-z7wkp\") pod \"community-operators-28smp\" (UID: 
\"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.776242 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-catalog-content\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.877665 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-utilities\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.877718 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wkp\" (UniqueName: \"kubernetes.io/projected/7ad37800-a8a2-4dc1-8537-69c29bbb8106-kube-api-access-z7wkp\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.877860 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-catalog-content\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.878378 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-utilities\") pod \"community-operators-28smp\" (UID: 
\"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.878403 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-catalog-content\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.897908 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7wkp\" (UniqueName: \"kubernetes.io/projected/7ad37800-a8a2-4dc1-8537-69c29bbb8106-kube-api-access-z7wkp\") pod \"community-operators-28smp\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:17 crc kubenswrapper[4921]: I0312 15:09:17.971961 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:18 crc kubenswrapper[4921]: I0312 15:09:18.512229 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28smp"] Mar 12 15:09:19 crc kubenswrapper[4921]: I0312 15:09:19.059294 4921 generic.go:334] "Generic (PLEG): container finished" podID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerID="92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351" exitCode=0 Mar 12 15:09:19 crc kubenswrapper[4921]: I0312 15:09:19.059333 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28smp" event={"ID":"7ad37800-a8a2-4dc1-8537-69c29bbb8106","Type":"ContainerDied","Data":"92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351"} Mar 12 15:09:19 crc kubenswrapper[4921]: I0312 15:09:19.059610 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28smp" event={"ID":"7ad37800-a8a2-4dc1-8537-69c29bbb8106","Type":"ContainerStarted","Data":"88d74112438a57a0f34a53f61cb6d56547388a1010bf91133786627f6dea917c"} Mar 12 15:09:20 crc kubenswrapper[4921]: I0312 15:09:20.983712 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:09:20 crc kubenswrapper[4921]: E0312 15:09:20.984373 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:09:21 crc kubenswrapper[4921]: I0312 15:09:21.085839 4921 generic.go:334] "Generic (PLEG): container finished" podID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" 
containerID="03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19" exitCode=0 Mar 12 15:09:21 crc kubenswrapper[4921]: I0312 15:09:21.085882 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28smp" event={"ID":"7ad37800-a8a2-4dc1-8537-69c29bbb8106","Type":"ContainerDied","Data":"03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19"} Mar 12 15:09:22 crc kubenswrapper[4921]: I0312 15:09:22.104661 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28smp" event={"ID":"7ad37800-a8a2-4dc1-8537-69c29bbb8106","Type":"ContainerStarted","Data":"75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d"} Mar 12 15:09:22 crc kubenswrapper[4921]: I0312 15:09:22.132088 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-28smp" podStartSLOduration=2.673238805 podStartE2EDuration="5.132071811s" podCreationTimestamp="2026-03-12 15:09:17 +0000 UTC" firstStartedPulling="2026-03-12 15:09:19.062058478 +0000 UTC m=+7181.752130449" lastFinishedPulling="2026-03-12 15:09:21.520891484 +0000 UTC m=+7184.210963455" observedRunningTime="2026-03-12 15:09:22.124443725 +0000 UTC m=+7184.814515686" watchObservedRunningTime="2026-03-12 15:09:22.132071811 +0000 UTC m=+7184.822143782" Mar 12 15:09:27 crc kubenswrapper[4921]: I0312 15:09:27.972911 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:27 crc kubenswrapper[4921]: I0312 15:09:27.973450 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:28 crc kubenswrapper[4921]: I0312 15:09:28.029497 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:28 crc kubenswrapper[4921]: I0312 
15:09:28.200945 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:28 crc kubenswrapper[4921]: I0312 15:09:28.266677 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28smp"] Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.165226 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-28smp" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="registry-server" containerID="cri-o://75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d" gracePeriod=2 Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.687711 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.760294 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-utilities\") pod \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.760572 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-catalog-content\") pod \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.760630 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7wkp\" (UniqueName: \"kubernetes.io/projected/7ad37800-a8a2-4dc1-8537-69c29bbb8106-kube-api-access-z7wkp\") pod \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\" (UID: \"7ad37800-a8a2-4dc1-8537-69c29bbb8106\") " Mar 12 15:09:30 crc kubenswrapper[4921]: 
I0312 15:09:30.761350 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-utilities" (OuterVolumeSpecName: "utilities") pod "7ad37800-a8a2-4dc1-8537-69c29bbb8106" (UID: "7ad37800-a8a2-4dc1-8537-69c29bbb8106"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.768097 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ad37800-a8a2-4dc1-8537-69c29bbb8106-kube-api-access-z7wkp" (OuterVolumeSpecName: "kube-api-access-z7wkp") pod "7ad37800-a8a2-4dc1-8537-69c29bbb8106" (UID: "7ad37800-a8a2-4dc1-8537-69c29bbb8106"). InnerVolumeSpecName "kube-api-access-z7wkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.825356 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ad37800-a8a2-4dc1-8537-69c29bbb8106" (UID: "7ad37800-a8a2-4dc1-8537-69c29bbb8106"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.862791 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.862869 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7wkp\" (UniqueName: \"kubernetes.io/projected/7ad37800-a8a2-4dc1-8537-69c29bbb8106-kube-api-access-z7wkp\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:30 crc kubenswrapper[4921]: I0312 15:09:30.862880 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ad37800-a8a2-4dc1-8537-69c29bbb8106-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.178953 4921 generic.go:334] "Generic (PLEG): container finished" podID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerID="75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d" exitCode=0 Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.179000 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28smp" event={"ID":"7ad37800-a8a2-4dc1-8537-69c29bbb8106","Type":"ContainerDied","Data":"75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d"} Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.179030 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28smp" event={"ID":"7ad37800-a8a2-4dc1-8537-69c29bbb8106","Type":"ContainerDied","Data":"88d74112438a57a0f34a53f61cb6d56547388a1010bf91133786627f6dea917c"} Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.179049 4921 scope.go:117] "RemoveContainer" containerID="75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 
15:09:31.179210 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28smp" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.213216 4921 scope.go:117] "RemoveContainer" containerID="03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.219872 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28smp"] Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.234373 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-28smp"] Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.243017 4921 scope.go:117] "RemoveContainer" containerID="92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.283727 4921 scope.go:117] "RemoveContainer" containerID="75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d" Mar 12 15:09:31 crc kubenswrapper[4921]: E0312 15:09:31.284269 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d\": container with ID starting with 75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d not found: ID does not exist" containerID="75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.284314 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d"} err="failed to get container status \"75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d\": rpc error: code = NotFound desc = could not find container \"75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d\": container with ID starting with 
75cdc4977303eb9f15145227d307baa459bee702277512966b0596313425c04d not found: ID does not exist" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.284338 4921 scope.go:117] "RemoveContainer" containerID="03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19" Mar 12 15:09:31 crc kubenswrapper[4921]: E0312 15:09:31.284660 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19\": container with ID starting with 03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19 not found: ID does not exist" containerID="03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.284688 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19"} err="failed to get container status \"03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19\": rpc error: code = NotFound desc = could not find container \"03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19\": container with ID starting with 03d4baa518d806326f693d6417830e52a913e042b351c82be188dce494566a19 not found: ID does not exist" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.284711 4921 scope.go:117] "RemoveContainer" containerID="92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351" Mar 12 15:09:31 crc kubenswrapper[4921]: E0312 15:09:31.284984 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351\": container with ID starting with 92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351 not found: ID does not exist" containerID="92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351" Mar 12 15:09:31 crc 
kubenswrapper[4921]: I0312 15:09:31.285017 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351"} err="failed to get container status \"92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351\": rpc error: code = NotFound desc = could not find container \"92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351\": container with ID starting with 92101dac6be90aceb7d4f16aac02a393e066cb78c59321469d8cb696c0f8b351 not found: ID does not exist" Mar 12 15:09:31 crc kubenswrapper[4921]: I0312 15:09:31.993925 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" path="/var/lib/kubelet/pods/7ad37800-a8a2-4dc1-8537-69c29bbb8106/volumes" Mar 12 15:09:34 crc kubenswrapper[4921]: I0312 15:09:34.986343 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:09:34 crc kubenswrapper[4921]: E0312 15:09:34.987645 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:09:49 crc kubenswrapper[4921]: I0312 15:09:49.983935 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:09:49 crc kubenswrapper[4921]: E0312 15:09:49.984838 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.148065 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555470-mbhr7"] Mar 12 15:10:00 crc kubenswrapper[4921]: E0312 15:10:00.149002 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="extract-content" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.149015 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="extract-content" Mar 12 15:10:00 crc kubenswrapper[4921]: E0312 15:10:00.149037 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="registry-server" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.149043 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="registry-server" Mar 12 15:10:00 crc kubenswrapper[4921]: E0312 15:10:00.149064 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="extract-utilities" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.149070 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="extract-utilities" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.149255 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ad37800-a8a2-4dc1-8537-69c29bbb8106" containerName="registry-server" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.149890 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.151570 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.154432 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.156026 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.160293 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-mbhr7"] Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.295034 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmgck\" (UniqueName: \"kubernetes.io/projected/c0764e67-c3d7-4d48-a31d-e6403fe9d048-kube-api-access-bmgck\") pod \"auto-csr-approver-29555470-mbhr7\" (UID: \"c0764e67-c3d7-4d48-a31d-e6403fe9d048\") " pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.397189 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmgck\" (UniqueName: \"kubernetes.io/projected/c0764e67-c3d7-4d48-a31d-e6403fe9d048-kube-api-access-bmgck\") pod \"auto-csr-approver-29555470-mbhr7\" (UID: \"c0764e67-c3d7-4d48-a31d-e6403fe9d048\") " pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.421869 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmgck\" (UniqueName: \"kubernetes.io/projected/c0764e67-c3d7-4d48-a31d-e6403fe9d048-kube-api-access-bmgck\") pod \"auto-csr-approver-29555470-mbhr7\" (UID: \"c0764e67-c3d7-4d48-a31d-e6403fe9d048\") " 
pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.475559 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:00 crc kubenswrapper[4921]: I0312 15:10:00.923068 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-mbhr7"] Mar 12 15:10:01 crc kubenswrapper[4921]: I0312 15:10:01.438684 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" event={"ID":"c0764e67-c3d7-4d48-a31d-e6403fe9d048","Type":"ContainerStarted","Data":"7369293a69cdbc4361fa328a79363013eee843653da7a7048bc9e551c8540477"} Mar 12 15:10:03 crc kubenswrapper[4921]: I0312 15:10:03.470840 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" event={"ID":"c0764e67-c3d7-4d48-a31d-e6403fe9d048","Type":"ContainerStarted","Data":"5c8daea8bc01256c00c454f0edea323222cca69e7f906ef89a16e3e65e5c71f9"} Mar 12 15:10:03 crc kubenswrapper[4921]: I0312 15:10:03.492956 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" podStartSLOduration=1.333966186 podStartE2EDuration="3.492935197s" podCreationTimestamp="2026-03-12 15:10:00 +0000 UTC" firstStartedPulling="2026-03-12 15:10:00.924028334 +0000 UTC m=+7223.614100305" lastFinishedPulling="2026-03-12 15:10:03.082997335 +0000 UTC m=+7225.773069316" observedRunningTime="2026-03-12 15:10:03.484486845 +0000 UTC m=+7226.174558816" watchObservedRunningTime="2026-03-12 15:10:03.492935197 +0000 UTC m=+7226.183007168" Mar 12 15:10:04 crc kubenswrapper[4921]: I0312 15:10:04.482497 4921 generic.go:334] "Generic (PLEG): container finished" podID="c0764e67-c3d7-4d48-a31d-e6403fe9d048" containerID="5c8daea8bc01256c00c454f0edea323222cca69e7f906ef89a16e3e65e5c71f9" exitCode=0 Mar 12 15:10:04 crc 
kubenswrapper[4921]: I0312 15:10:04.482558 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" event={"ID":"c0764e67-c3d7-4d48-a31d-e6403fe9d048","Type":"ContainerDied","Data":"5c8daea8bc01256c00c454f0edea323222cca69e7f906ef89a16e3e65e5c71f9"} Mar 12 15:10:04 crc kubenswrapper[4921]: I0312 15:10:04.983544 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:10:04 crc kubenswrapper[4921]: E0312 15:10:04.984035 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:10:05 crc kubenswrapper[4921]: I0312 15:10:05.903253 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.038684 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmgck\" (UniqueName: \"kubernetes.io/projected/c0764e67-c3d7-4d48-a31d-e6403fe9d048-kube-api-access-bmgck\") pod \"c0764e67-c3d7-4d48-a31d-e6403fe9d048\" (UID: \"c0764e67-c3d7-4d48-a31d-e6403fe9d048\") " Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.049104 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0764e67-c3d7-4d48-a31d-e6403fe9d048-kube-api-access-bmgck" (OuterVolumeSpecName: "kube-api-access-bmgck") pod "c0764e67-c3d7-4d48-a31d-e6403fe9d048" (UID: "c0764e67-c3d7-4d48-a31d-e6403fe9d048"). InnerVolumeSpecName "kube-api-access-bmgck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.142698 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmgck\" (UniqueName: \"kubernetes.io/projected/c0764e67-c3d7-4d48-a31d-e6403fe9d048-kube-api-access-bmgck\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.498607 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" event={"ID":"c0764e67-c3d7-4d48-a31d-e6403fe9d048","Type":"ContainerDied","Data":"7369293a69cdbc4361fa328a79363013eee843653da7a7048bc9e551c8540477"} Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.498644 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7369293a69cdbc4361fa328a79363013eee843653da7a7048bc9e551c8540477" Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.498669 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555470-mbhr7" Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.550757 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-cdx7n"] Mar 12 15:10:06 crc kubenswrapper[4921]: I0312 15:10:06.558718 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555464-cdx7n"] Mar 12 15:10:08 crc kubenswrapper[4921]: I0312 15:10:08.007578 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb767b0-e911-44cb-b21a-25084bdd54aa" path="/var/lib/kubelet/pods/afb767b0-e911-44cb-b21a-25084bdd54aa/volumes" Mar 12 15:10:18 crc kubenswrapper[4921]: I0312 15:10:18.984131 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:10:18 crc kubenswrapper[4921]: E0312 15:10:18.985200 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:10:29 crc kubenswrapper[4921]: I0312 15:10:29.984307 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:10:29 crc kubenswrapper[4921]: E0312 15:10:29.985208 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.169091 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8xc5r"] Mar 12 15:10:43 crc kubenswrapper[4921]: E0312 15:10:43.170261 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0764e67-c3d7-4d48-a31d-e6403fe9d048" containerName="oc" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.170282 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0764e67-c3d7-4d48-a31d-e6403fe9d048" containerName="oc" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.170538 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0764e67-c3d7-4d48-a31d-e6403fe9d048" containerName="oc" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.172541 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.187695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xc5r"] Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.283879 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-utilities\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.284025 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2glr\" (UniqueName: \"kubernetes.io/projected/37b250e3-3df7-4192-a26f-c04e85d2023c-kube-api-access-z2glr\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.284063 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-catalog-content\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.387043 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-utilities\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.387198 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-z2glr\" (UniqueName: \"kubernetes.io/projected/37b250e3-3df7-4192-a26f-c04e85d2023c-kube-api-access-z2glr\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.387234 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-catalog-content\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.387625 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-utilities\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.387895 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-catalog-content\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.410830 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2glr\" (UniqueName: \"kubernetes.io/projected/37b250e3-3df7-4192-a26f-c04e85d2023c-kube-api-access-z2glr\") pod \"redhat-marketplace-8xc5r\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.501742 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:43 crc kubenswrapper[4921]: I0312 15:10:43.966514 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xc5r"] Mar 12 15:10:44 crc kubenswrapper[4921]: I0312 15:10:44.928297 4921 generic.go:334] "Generic (PLEG): container finished" podID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerID="082056f136269cc34e282f084b54eb6a2a88e2b993f332a94261540229b9957d" exitCode=0 Mar 12 15:10:44 crc kubenswrapper[4921]: I0312 15:10:44.928365 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xc5r" event={"ID":"37b250e3-3df7-4192-a26f-c04e85d2023c","Type":"ContainerDied","Data":"082056f136269cc34e282f084b54eb6a2a88e2b993f332a94261540229b9957d"} Mar 12 15:10:44 crc kubenswrapper[4921]: I0312 15:10:44.928571 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xc5r" event={"ID":"37b250e3-3df7-4192-a26f-c04e85d2023c","Type":"ContainerStarted","Data":"b3501b413dddfc6080691691aa1e618312423130e6db8fe9e87c73ea1f8010f6"} Mar 12 15:10:44 crc kubenswrapper[4921]: I0312 15:10:44.983581 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:10:44 crc kubenswrapper[4921]: E0312 15:10:44.983879 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:10:45 crc kubenswrapper[4921]: I0312 15:10:45.938936 4921 generic.go:334] "Generic (PLEG): container finished" podID="37b250e3-3df7-4192-a26f-c04e85d2023c" 
containerID="a61426a05b81ff5085138f21e240d4376d4b439a72500310e7ace0abaa77ce1f" exitCode=0 Mar 12 15:10:45 crc kubenswrapper[4921]: I0312 15:10:45.938984 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xc5r" event={"ID":"37b250e3-3df7-4192-a26f-c04e85d2023c","Type":"ContainerDied","Data":"a61426a05b81ff5085138f21e240d4376d4b439a72500310e7ace0abaa77ce1f"} Mar 12 15:10:46 crc kubenswrapper[4921]: I0312 15:10:46.950888 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xc5r" event={"ID":"37b250e3-3df7-4192-a26f-c04e85d2023c","Type":"ContainerStarted","Data":"7f8b634da4801157cdc1d96a64e957afb59bd934b2ed417eb3492f88977838b8"} Mar 12 15:10:46 crc kubenswrapper[4921]: I0312 15:10:46.976006 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8xc5r" podStartSLOduration=2.551684534 podStartE2EDuration="3.975987345s" podCreationTimestamp="2026-03-12 15:10:43 +0000 UTC" firstStartedPulling="2026-03-12 15:10:44.930638551 +0000 UTC m=+7267.620710522" lastFinishedPulling="2026-03-12 15:10:46.354941362 +0000 UTC m=+7269.045013333" observedRunningTime="2026-03-12 15:10:46.97260738 +0000 UTC m=+7269.662679361" watchObservedRunningTime="2026-03-12 15:10:46.975987345 +0000 UTC m=+7269.666059336" Mar 12 15:10:50 crc kubenswrapper[4921]: I0312 15:10:50.394054 4921 scope.go:117] "RemoveContainer" containerID="2aaacaf4d2aa64d83ce434101909e21b32af3a8b8bc1c50608480fb0a0df1fb9" Mar 12 15:10:53 crc kubenswrapper[4921]: I0312 15:10:53.501906 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:53 crc kubenswrapper[4921]: I0312 15:10:53.502341 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:53 crc kubenswrapper[4921]: I0312 15:10:53.553920 4921 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:54 crc kubenswrapper[4921]: I0312 15:10:54.055711 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:55 crc kubenswrapper[4921]: I0312 15:10:55.187430 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xc5r"] Mar 12 15:10:55 crc kubenswrapper[4921]: I0312 15:10:55.984230 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:10:55 crc kubenswrapper[4921]: E0312 15:10:55.984864 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:10:56 crc kubenswrapper[4921]: I0312 15:10:56.023486 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8xc5r" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="registry-server" containerID="cri-o://7f8b634da4801157cdc1d96a64e957afb59bd934b2ed417eb3492f88977838b8" gracePeriod=2 Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.034940 4921 generic.go:334] "Generic (PLEG): container finished" podID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerID="7f8b634da4801157cdc1d96a64e957afb59bd934b2ed417eb3492f88977838b8" exitCode=0 Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.035493 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xc5r" 
event={"ID":"37b250e3-3df7-4192-a26f-c04e85d2023c","Type":"ContainerDied","Data":"7f8b634da4801157cdc1d96a64e957afb59bd934b2ed417eb3492f88977838b8"} Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.036542 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xc5r" event={"ID":"37b250e3-3df7-4192-a26f-c04e85d2023c","Type":"ContainerDied","Data":"b3501b413dddfc6080691691aa1e618312423130e6db8fe9e87c73ea1f8010f6"} Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.036606 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3501b413dddfc6080691691aa1e618312423130e6db8fe9e87c73ea1f8010f6" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.050042 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.091899 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-catalog-content\") pod \"37b250e3-3df7-4192-a26f-c04e85d2023c\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.092307 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-utilities\") pod \"37b250e3-3df7-4192-a26f-c04e85d2023c\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.092499 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2glr\" (UniqueName: \"kubernetes.io/projected/37b250e3-3df7-4192-a26f-c04e85d2023c-kube-api-access-z2glr\") pod \"37b250e3-3df7-4192-a26f-c04e85d2023c\" (UID: \"37b250e3-3df7-4192-a26f-c04e85d2023c\") " Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 
15:10:57.093352 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-utilities" (OuterVolumeSpecName: "utilities") pod "37b250e3-3df7-4192-a26f-c04e85d2023c" (UID: "37b250e3-3df7-4192-a26f-c04e85d2023c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.093579 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.098621 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b250e3-3df7-4192-a26f-c04e85d2023c-kube-api-access-z2glr" (OuterVolumeSpecName: "kube-api-access-z2glr") pod "37b250e3-3df7-4192-a26f-c04e85d2023c" (UID: "37b250e3-3df7-4192-a26f-c04e85d2023c"). InnerVolumeSpecName "kube-api-access-z2glr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.130470 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37b250e3-3df7-4192-a26f-c04e85d2023c" (UID: "37b250e3-3df7-4192-a26f-c04e85d2023c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.195728 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2glr\" (UniqueName: \"kubernetes.io/projected/37b250e3-3df7-4192-a26f-c04e85d2023c-kube-api-access-z2glr\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:57 crc kubenswrapper[4921]: I0312 15:10:57.195765 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37b250e3-3df7-4192-a26f-c04e85d2023c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:10:58 crc kubenswrapper[4921]: I0312 15:10:58.043496 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xc5r" Mar 12 15:10:58 crc kubenswrapper[4921]: I0312 15:10:58.075384 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xc5r"] Mar 12 15:10:58 crc kubenswrapper[4921]: I0312 15:10:58.093252 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xc5r"] Mar 12 15:10:59 crc kubenswrapper[4921]: I0312 15:10:59.992802 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" path="/var/lib/kubelet/pods/37b250e3-3df7-4192-a26f-c04e85d2023c/volumes" Mar 12 15:11:06 crc kubenswrapper[4921]: I0312 15:11:06.983692 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:11:06 crc kubenswrapper[4921]: E0312 15:11:06.985103 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:11:21 crc kubenswrapper[4921]: I0312 15:11:21.983685 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:11:21 crc kubenswrapper[4921]: E0312 15:11:21.984503 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:11:32 crc kubenswrapper[4921]: I0312 15:11:32.983447 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:11:32 crc kubenswrapper[4921]: E0312 15:11:32.984196 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:11:46 crc kubenswrapper[4921]: I0312 15:11:46.985275 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:11:46 crc kubenswrapper[4921]: E0312 15:11:46.986123 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.142930 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555472-km95k"] Mar 12 15:12:00 crc kubenswrapper[4921]: E0312 15:12:00.143787 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="extract-content" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.143799 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="extract-content" Mar 12 15:12:00 crc kubenswrapper[4921]: E0312 15:12:00.143835 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="registry-server" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.143841 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="registry-server" Mar 12 15:12:00 crc kubenswrapper[4921]: E0312 15:12:00.143853 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="extract-utilities" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.143861 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="extract-utilities" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.144040 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b250e3-3df7-4192-a26f-c04e85d2023c" containerName="registry-server" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.144651 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.150247 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.150418 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.150536 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.153570 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-km95k"] Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.198233 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbtch\" (UniqueName: \"kubernetes.io/projected/81e90165-7578-4c29-883a-644ab365bfff-kube-api-access-kbtch\") pod \"auto-csr-approver-29555472-km95k\" (UID: \"81e90165-7578-4c29-883a-644ab365bfff\") " pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.300473 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbtch\" (UniqueName: \"kubernetes.io/projected/81e90165-7578-4c29-883a-644ab365bfff-kube-api-access-kbtch\") pod \"auto-csr-approver-29555472-km95k\" (UID: \"81e90165-7578-4c29-883a-644ab365bfff\") " pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.319423 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbtch\" (UniqueName: \"kubernetes.io/projected/81e90165-7578-4c29-883a-644ab365bfff-kube-api-access-kbtch\") pod \"auto-csr-approver-29555472-km95k\" (UID: \"81e90165-7578-4c29-883a-644ab365bfff\") " 
pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.466884 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.887712 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-km95k"] Mar 12 15:12:00 crc kubenswrapper[4921]: I0312 15:12:00.983599 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:12:00 crc kubenswrapper[4921]: E0312 15:12:00.984059 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:12:01 crc kubenswrapper[4921]: I0312 15:12:01.658681 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-km95k" event={"ID":"81e90165-7578-4c29-883a-644ab365bfff","Type":"ContainerStarted","Data":"03d17f637859fbc8c3a71a5b305968a18813b7a9db489b493a4113afa4f96489"} Mar 12 15:12:02 crc kubenswrapper[4921]: I0312 15:12:02.668790 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-km95k" event={"ID":"81e90165-7578-4c29-883a-644ab365bfff","Type":"ContainerStarted","Data":"49b3cbd7bc2c5effb9ac02846a06fee2fb4759ff125d59a1b944e7690beea75b"} Mar 12 15:12:02 crc kubenswrapper[4921]: I0312 15:12:02.683640 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555472-km95k" podStartSLOduration=1.320684114 
podStartE2EDuration="2.683615661s" podCreationTimestamp="2026-03-12 15:12:00 +0000 UTC" firstStartedPulling="2026-03-12 15:12:00.904028234 +0000 UTC m=+7343.594100195" lastFinishedPulling="2026-03-12 15:12:02.266959771 +0000 UTC m=+7344.957031742" observedRunningTime="2026-03-12 15:12:02.678268925 +0000 UTC m=+7345.368340896" watchObservedRunningTime="2026-03-12 15:12:02.683615661 +0000 UTC m=+7345.373687632" Mar 12 15:12:03 crc kubenswrapper[4921]: I0312 15:12:03.678427 4921 generic.go:334] "Generic (PLEG): container finished" podID="81e90165-7578-4c29-883a-644ab365bfff" containerID="49b3cbd7bc2c5effb9ac02846a06fee2fb4759ff125d59a1b944e7690beea75b" exitCode=0 Mar 12 15:12:03 crc kubenswrapper[4921]: I0312 15:12:03.678528 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-km95k" event={"ID":"81e90165-7578-4c29-883a-644ab365bfff","Type":"ContainerDied","Data":"49b3cbd7bc2c5effb9ac02846a06fee2fb4759ff125d59a1b944e7690beea75b"} Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.108792 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.194055 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbtch\" (UniqueName: \"kubernetes.io/projected/81e90165-7578-4c29-883a-644ab365bfff-kube-api-access-kbtch\") pod \"81e90165-7578-4c29-883a-644ab365bfff\" (UID: \"81e90165-7578-4c29-883a-644ab365bfff\") " Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.199399 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e90165-7578-4c29-883a-644ab365bfff-kube-api-access-kbtch" (OuterVolumeSpecName: "kube-api-access-kbtch") pod "81e90165-7578-4c29-883a-644ab365bfff" (UID: "81e90165-7578-4c29-883a-644ab365bfff"). InnerVolumeSpecName "kube-api-access-kbtch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.296916 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbtch\" (UniqueName: \"kubernetes.io/projected/81e90165-7578-4c29-883a-644ab365bfff-kube-api-access-kbtch\") on node \"crc\" DevicePath \"\"" Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.694379 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555472-km95k" event={"ID":"81e90165-7578-4c29-883a-644ab365bfff","Type":"ContainerDied","Data":"03d17f637859fbc8c3a71a5b305968a18813b7a9db489b493a4113afa4f96489"} Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.694419 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03d17f637859fbc8c3a71a5b305968a18813b7a9db489b493a4113afa4f96489" Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.694491 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555472-km95k" Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.748074 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-96cfs"] Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.757312 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555466-96cfs"] Mar 12 15:12:05 crc kubenswrapper[4921]: I0312 15:12:05.993514 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e6cb18-b09a-47c0-bc10-eeee61328e31" path="/var/lib/kubelet/pods/e0e6cb18-b09a-47c0-bc10-eeee61328e31/volumes" Mar 12 15:12:11 crc kubenswrapper[4921]: I0312 15:12:11.983796 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:12:11 crc kubenswrapper[4921]: E0312 15:12:11.984745 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:12:24 crc kubenswrapper[4921]: I0312 15:12:24.400019 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-fp4rs" podUID="001425f5-0a2a-4bdc-a437-d6f9ba3687b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.64:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 15:12:24 crc kubenswrapper[4921]: I0312 15:12:24.418660 4921 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jn8d5" podUID="aabfc338-f7a1-46a8-a02a-daf1adc64862" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.51:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 15:12:25 crc kubenswrapper[4921]: I0312 15:12:25.983058 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:12:25 crc kubenswrapper[4921]: E0312 15:12:25.983784 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:12:38 crc kubenswrapper[4921]: I0312 15:12:38.982903 4921 scope.go:117] "RemoveContainer" 
containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:12:38 crc kubenswrapper[4921]: E0312 15:12:38.983726 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:12:50 crc kubenswrapper[4921]: I0312 15:12:50.499105 4921 scope.go:117] "RemoveContainer" containerID="e193d817f438a9c411f85d114e72b737210615639eb27c5b7ad1db5202e501db" Mar 12 15:12:52 crc kubenswrapper[4921]: I0312 15:12:52.985341 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:12:52 crc kubenswrapper[4921]: E0312 15:12:52.986800 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:13:04 crc kubenswrapper[4921]: I0312 15:13:04.983842 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:13:04 crc kubenswrapper[4921]: E0312 15:13:04.984683 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:13:17 crc kubenswrapper[4921]: I0312 15:13:17.993946 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:13:17 crc kubenswrapper[4921]: E0312 15:13:17.994895 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:13:28 crc kubenswrapper[4921]: I0312 15:13:28.983894 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:13:28 crc kubenswrapper[4921]: E0312 15:13:28.984689 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:13:41 crc kubenswrapper[4921]: I0312 15:13:41.985370 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:13:41 crc kubenswrapper[4921]: E0312 15:13:41.986438 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:13:53 crc kubenswrapper[4921]: I0312 15:13:53.984035 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:13:53 crc kubenswrapper[4921]: E0312 15:13:53.984874 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.148116 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555474-9sn7h"] Mar 12 15:14:00 crc kubenswrapper[4921]: E0312 15:14:00.149184 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e90165-7578-4c29-883a-644ab365bfff" containerName="oc" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.149197 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e90165-7578-4c29-883a-644ab365bfff" containerName="oc" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.149394 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e90165-7578-4c29-883a-644ab365bfff" containerName="oc" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.150048 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.152651 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.152794 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.152969 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.168708 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-9sn7h"] Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.299361 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrftg\" (UniqueName: \"kubernetes.io/projected/5a1892d6-8db6-484b-bdf1-75c061ee506a-kube-api-access-jrftg\") pod \"auto-csr-approver-29555474-9sn7h\" (UID: \"5a1892d6-8db6-484b-bdf1-75c061ee506a\") " pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.401555 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrftg\" (UniqueName: \"kubernetes.io/projected/5a1892d6-8db6-484b-bdf1-75c061ee506a-kube-api-access-jrftg\") pod \"auto-csr-approver-29555474-9sn7h\" (UID: \"5a1892d6-8db6-484b-bdf1-75c061ee506a\") " pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.423220 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrftg\" (UniqueName: \"kubernetes.io/projected/5a1892d6-8db6-484b-bdf1-75c061ee506a-kube-api-access-jrftg\") pod \"auto-csr-approver-29555474-9sn7h\" (UID: \"5a1892d6-8db6-484b-bdf1-75c061ee506a\") " 
pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:00 crc kubenswrapper[4921]: I0312 15:14:00.469629 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:01 crc kubenswrapper[4921]: I0312 15:14:01.005079 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-9sn7h"] Mar 12 15:14:01 crc kubenswrapper[4921]: I0312 15:14:01.288547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" event={"ID":"5a1892d6-8db6-484b-bdf1-75c061ee506a","Type":"ContainerStarted","Data":"5c5f1032f03e601e9f4372a4ffc86b8da36944cc108155c20488a89c6ad20962"} Mar 12 15:14:02 crc kubenswrapper[4921]: I0312 15:14:02.298777 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" event={"ID":"5a1892d6-8db6-484b-bdf1-75c061ee506a","Type":"ContainerStarted","Data":"34db1c5a592b52e623a90e94d06110b85a45f62ab11a08bbc0513beb736161f2"} Mar 12 15:14:02 crc kubenswrapper[4921]: I0312 15:14:02.315616 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" podStartSLOduration=1.358253224 podStartE2EDuration="2.315598844s" podCreationTimestamp="2026-03-12 15:14:00 +0000 UTC" firstStartedPulling="2026-03-12 15:14:01.010129791 +0000 UTC m=+7463.700201762" lastFinishedPulling="2026-03-12 15:14:01.967475411 +0000 UTC m=+7464.657547382" observedRunningTime="2026-03-12 15:14:02.313013844 +0000 UTC m=+7465.003085825" watchObservedRunningTime="2026-03-12 15:14:02.315598844 +0000 UTC m=+7465.005670815" Mar 12 15:14:03 crc kubenswrapper[4921]: I0312 15:14:03.307732 4921 generic.go:334] "Generic (PLEG): container finished" podID="5a1892d6-8db6-484b-bdf1-75c061ee506a" containerID="34db1c5a592b52e623a90e94d06110b85a45f62ab11a08bbc0513beb736161f2" exitCode=0 Mar 12 15:14:03 crc 
kubenswrapper[4921]: I0312 15:14:03.307805 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" event={"ID":"5a1892d6-8db6-484b-bdf1-75c061ee506a","Type":"ContainerDied","Data":"34db1c5a592b52e623a90e94d06110b85a45f62ab11a08bbc0513beb736161f2"} Mar 12 15:14:04 crc kubenswrapper[4921]: I0312 15:14:04.709729 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:04 crc kubenswrapper[4921]: I0312 15:14:04.790887 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrftg\" (UniqueName: \"kubernetes.io/projected/5a1892d6-8db6-484b-bdf1-75c061ee506a-kube-api-access-jrftg\") pod \"5a1892d6-8db6-484b-bdf1-75c061ee506a\" (UID: \"5a1892d6-8db6-484b-bdf1-75c061ee506a\") " Mar 12 15:14:04 crc kubenswrapper[4921]: I0312 15:14:04.797222 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1892d6-8db6-484b-bdf1-75c061ee506a-kube-api-access-jrftg" (OuterVolumeSpecName: "kube-api-access-jrftg") pod "5a1892d6-8db6-484b-bdf1-75c061ee506a" (UID: "5a1892d6-8db6-484b-bdf1-75c061ee506a"). InnerVolumeSpecName "kube-api-access-jrftg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:14:04 crc kubenswrapper[4921]: I0312 15:14:04.893081 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrftg\" (UniqueName: \"kubernetes.io/projected/5a1892d6-8db6-484b-bdf1-75c061ee506a-kube-api-access-jrftg\") on node \"crc\" DevicePath \"\"" Mar 12 15:14:05 crc kubenswrapper[4921]: I0312 15:14:05.324903 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" event={"ID":"5a1892d6-8db6-484b-bdf1-75c061ee506a","Type":"ContainerDied","Data":"5c5f1032f03e601e9f4372a4ffc86b8da36944cc108155c20488a89c6ad20962"} Mar 12 15:14:05 crc kubenswrapper[4921]: I0312 15:14:05.325304 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c5f1032f03e601e9f4372a4ffc86b8da36944cc108155c20488a89c6ad20962" Mar 12 15:14:05 crc kubenswrapper[4921]: I0312 15:14:05.325171 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555474-9sn7h" Mar 12 15:14:05 crc kubenswrapper[4921]: I0312 15:14:05.384511 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-9j4ts"] Mar 12 15:14:05 crc kubenswrapper[4921]: I0312 15:14:05.396780 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555468-9j4ts"] Mar 12 15:14:05 crc kubenswrapper[4921]: I0312 15:14:05.994791 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d92c0a0-5c45-4181-bf72-187763dfba56" path="/var/lib/kubelet/pods/4d92c0a0-5c45-4181-bf72-187763dfba56/volumes" Mar 12 15:14:06 crc kubenswrapper[4921]: I0312 15:14:06.984276 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:14:07 crc kubenswrapper[4921]: I0312 15:14:07.353303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"e35138a155ccdd9d0ac38b057cae71bd9423544b9bf4cb58d46480243f059f38"} Mar 12 15:14:50 crc kubenswrapper[4921]: I0312 15:14:50.609615 4921 scope.go:117] "RemoveContainer" containerID="d1b56603c2eee71b7d18bce67c49be5d4eaeaebc36f727555025f5f8179d4787" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.171060 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6"] Mar 12 15:15:00 crc kubenswrapper[4921]: E0312 15:15:00.172782 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1892d6-8db6-484b-bdf1-75c061ee506a" containerName="oc" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.172807 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1892d6-8db6-484b-bdf1-75c061ee506a" containerName="oc" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.173144 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1892d6-8db6-484b-bdf1-75c061ee506a" containerName="oc" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.174266 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.176608 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.179452 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.185043 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-config-volume\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.185116 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r785k\" (UniqueName: \"kubernetes.io/projected/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-kube-api-access-r785k\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.185370 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-secret-volume\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.192011 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6"] Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.287690 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-secret-volume\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.288288 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-config-volume\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.288327 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r785k\" (UniqueName: \"kubernetes.io/projected/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-kube-api-access-r785k\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.289038 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-config-volume\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.295805 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-secret-volume\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.308007 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r785k\" (UniqueName: \"kubernetes.io/projected/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-kube-api-access-r785k\") pod \"collect-profiles-29555475-xrvq6\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:00 crc kubenswrapper[4921]: I0312 15:15:00.505711 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:01 crc kubenswrapper[4921]: I0312 15:15:01.031729 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6"] Mar 12 15:15:01 crc kubenswrapper[4921]: I0312 15:15:01.150337 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" event={"ID":"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d","Type":"ContainerStarted","Data":"413d6df4727de21f7ebd44b01777efdbe60d8bdc4dd919d2fd295f9c3f626b6a"} Mar 12 15:15:02 crc kubenswrapper[4921]: I0312 15:15:02.160930 4921 generic.go:334] "Generic (PLEG): container finished" podID="a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" containerID="3da3e6f57a53a95762944b6d64a52a058514e991b1e7b4a656987e6c51c04b27" exitCode=0 Mar 12 15:15:02 crc kubenswrapper[4921]: I0312 15:15:02.160995 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" 
event={"ID":"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d","Type":"ContainerDied","Data":"3da3e6f57a53a95762944b6d64a52a058514e991b1e7b4a656987e6c51c04b27"} Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.596044 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.674274 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-secret-volume\") pod \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.674365 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-config-volume\") pod \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.674589 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r785k\" (UniqueName: \"kubernetes.io/projected/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-kube-api-access-r785k\") pod \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\" (UID: \"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d\") " Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.675946 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" (UID: "a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.684479 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-kube-api-access-r785k" (OuterVolumeSpecName: "kube-api-access-r785k") pod "a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" (UID: "a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d"). InnerVolumeSpecName "kube-api-access-r785k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.692919 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" (UID: "a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.777480 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.777514 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:03 crc kubenswrapper[4921]: I0312 15:15:03.777525 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r785k\" (UniqueName: \"kubernetes.io/projected/a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d-kube-api-access-r785k\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:04 crc kubenswrapper[4921]: I0312 15:15:04.183415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" 
event={"ID":"a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d","Type":"ContainerDied","Data":"413d6df4727de21f7ebd44b01777efdbe60d8bdc4dd919d2fd295f9c3f626b6a"} Mar 12 15:15:04 crc kubenswrapper[4921]: I0312 15:15:04.183459 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="413d6df4727de21f7ebd44b01777efdbe60d8bdc4dd919d2fd295f9c3f626b6a" Mar 12 15:15:04 crc kubenswrapper[4921]: I0312 15:15:04.183506 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555475-xrvq6" Mar 12 15:15:04 crc kubenswrapper[4921]: I0312 15:15:04.698330 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6"] Mar 12 15:15:04 crc kubenswrapper[4921]: I0312 15:15:04.706773 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555430-cjlq6"] Mar 12 15:15:06 crc kubenswrapper[4921]: I0312 15:15:06.002102 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118e714c-50a7-422c-9c55-a03871013348" path="/var/lib/kubelet/pods/118e714c-50a7-422c-9c55-a03871013348/volumes" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.747406 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4qg9c"] Mar 12 15:15:26 crc kubenswrapper[4921]: E0312 15:15:26.750612 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" containerName="collect-profiles" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.750634 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" containerName="collect-profiles" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.751485 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ea3eda-0129-49e8-b20f-9d60f8ea2b7d" containerName="collect-profiles" Mar 
12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.756975 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.794066 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qg9c"] Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.818206 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-catalog-content\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.818518 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-utilities\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.818602 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjmn7\" (UniqueName: \"kubernetes.io/projected/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-kube-api-access-kjmn7\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.921220 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-catalog-content\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc 
kubenswrapper[4921]: I0312 15:15:26.921373 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-utilities\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.921405 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjmn7\" (UniqueName: \"kubernetes.io/projected/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-kube-api-access-kjmn7\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.921851 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-catalog-content\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.922181 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-utilities\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:26 crc kubenswrapper[4921]: I0312 15:15:26.949696 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjmn7\" (UniqueName: \"kubernetes.io/projected/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-kube-api-access-kjmn7\") pod \"redhat-operators-4qg9c\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:27 crc kubenswrapper[4921]: I0312 15:15:27.105522 4921 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:27 crc kubenswrapper[4921]: I0312 15:15:27.739788 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qg9c"] Mar 12 15:15:28 crc kubenswrapper[4921]: I0312 15:15:28.429733 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerID="0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab" exitCode=0 Mar 12 15:15:28 crc kubenswrapper[4921]: I0312 15:15:28.430014 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerDied","Data":"0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab"} Mar 12 15:15:28 crc kubenswrapper[4921]: I0312 15:15:28.430157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerStarted","Data":"46ca75f8efb4009bab286494a6f0c2a09b612132c492663b35e564fee3da90a9"} Mar 12 15:15:28 crc kubenswrapper[4921]: I0312 15:15:28.435796 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:15:30 crc kubenswrapper[4921]: I0312 15:15:30.467744 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerStarted","Data":"42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40"} Mar 12 15:15:32 crc kubenswrapper[4921]: I0312 15:15:32.488749 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerID="42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40" exitCode=0 Mar 12 15:15:32 crc kubenswrapper[4921]: I0312 15:15:32.489125 4921 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerDied","Data":"42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40"} Mar 12 15:15:34 crc kubenswrapper[4921]: I0312 15:15:34.517261 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerStarted","Data":"71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf"} Mar 12 15:15:34 crc kubenswrapper[4921]: I0312 15:15:34.545774 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4qg9c" podStartSLOduration=3.572050885 podStartE2EDuration="8.545752168s" podCreationTimestamp="2026-03-12 15:15:26 +0000 UTC" firstStartedPulling="2026-03-12 15:15:28.435524484 +0000 UTC m=+7551.125596445" lastFinishedPulling="2026-03-12 15:15:33.409225747 +0000 UTC m=+7556.099297728" observedRunningTime="2026-03-12 15:15:34.537964986 +0000 UTC m=+7557.228036967" watchObservedRunningTime="2026-03-12 15:15:34.545752168 +0000 UTC m=+7557.235824139" Mar 12 15:15:37 crc kubenswrapper[4921]: I0312 15:15:37.106320 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:37 crc kubenswrapper[4921]: I0312 15:15:37.106967 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:38 crc kubenswrapper[4921]: I0312 15:15:38.156989 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4qg9c" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="registry-server" probeResult="failure" output=< Mar 12 15:15:38 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 15:15:38 crc kubenswrapper[4921]: > Mar 12 15:15:47 crc 
kubenswrapper[4921]: I0312 15:15:47.177100 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:47 crc kubenswrapper[4921]: I0312 15:15:47.258242 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:47 crc kubenswrapper[4921]: I0312 15:15:47.428050 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qg9c"] Mar 12 15:15:48 crc kubenswrapper[4921]: I0312 15:15:48.655486 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4qg9c" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="registry-server" containerID="cri-o://71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf" gracePeriod=2 Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.271530 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.386290 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjmn7\" (UniqueName: \"kubernetes.io/projected/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-kube-api-access-kjmn7\") pod \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.386797 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-catalog-content\") pod \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.386863 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-utilities\") pod \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\" (UID: \"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b\") " Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.387651 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-utilities" (OuterVolumeSpecName: "utilities") pod "c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" (UID: "c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.396471 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-kube-api-access-kjmn7" (OuterVolumeSpecName: "kube-api-access-kjmn7") pod "c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" (UID: "c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b"). InnerVolumeSpecName "kube-api-access-kjmn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.489995 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjmn7\" (UniqueName: \"kubernetes.io/projected/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-kube-api-access-kjmn7\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.490052 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.518073 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" (UID: "c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.593433 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.668650 4921 generic.go:334] "Generic (PLEG): container finished" podID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerID="71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf" exitCode=0 Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.668701 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerDied","Data":"71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf"} Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.668730 4921 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-4qg9c" event={"ID":"c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b","Type":"ContainerDied","Data":"46ca75f8efb4009bab286494a6f0c2a09b612132c492663b35e564fee3da90a9"} Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.668747 4921 scope.go:117] "RemoveContainer" containerID="71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.668777 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qg9c" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.689959 4921 scope.go:117] "RemoveContainer" containerID="42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.711767 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qg9c"] Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.721370 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4qg9c"] Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.726446 4921 scope.go:117] "RemoveContainer" containerID="0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.775395 4921 scope.go:117] "RemoveContainer" containerID="71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf" Mar 12 15:15:49 crc kubenswrapper[4921]: E0312 15:15:49.776069 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf\": container with ID starting with 71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf not found: ID does not exist" containerID="71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.776124 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf"} err="failed to get container status \"71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf\": rpc error: code = NotFound desc = could not find container \"71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf\": container with ID starting with 71d5faece67e9bc9a8019fa2f992572406bd14b2048fb7c89913ad72b9f79adf not found: ID does not exist" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.776163 4921 scope.go:117] "RemoveContainer" containerID="42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40" Mar 12 15:15:49 crc kubenswrapper[4921]: E0312 15:15:49.776548 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40\": container with ID starting with 42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40 not found: ID does not exist" containerID="42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.776586 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40"} err="failed to get container status \"42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40\": rpc error: code = NotFound desc = could not find container \"42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40\": container with ID starting with 42088d74da2a26c43758f9552cb1efe7c88a47c4ba138287f6d41dc4f111ca40 not found: ID does not exist" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.776608 4921 scope.go:117] "RemoveContainer" containerID="0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab" Mar 12 15:15:49 crc kubenswrapper[4921]: E0312 
15:15:49.780889 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab\": container with ID starting with 0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab not found: ID does not exist" containerID="0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.780957 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab"} err="failed to get container status \"0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab\": rpc error: code = NotFound desc = could not find container \"0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab\": container with ID starting with 0309bd2dd63d332aafc99255ef883a24193736869574d9faec4a6852a22342ab not found: ID does not exist" Mar 12 15:15:49 crc kubenswrapper[4921]: I0312 15:15:49.997511 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" path="/var/lib/kubelet/pods/c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b/volumes" Mar 12 15:15:50 crc kubenswrapper[4921]: I0312 15:15:50.751385 4921 scope.go:117] "RemoveContainer" containerID="b6b57bc1dbe0c66620682ae98b002231d25a3501b243d6cccc1010358e35ad2a" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.163116 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555476-wc6zf"] Mar 12 15:16:00 crc kubenswrapper[4921]: E0312 15:16:00.164264 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="extract-content" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.164281 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" 
containerName="extract-content" Mar 12 15:16:00 crc kubenswrapper[4921]: E0312 15:16:00.164292 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="extract-utilities" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.164299 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="extract-utilities" Mar 12 15:16:00 crc kubenswrapper[4921]: E0312 15:16:00.164326 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="registry-server" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.164335 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="registry-server" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.164604 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d16a5a-3e6f-4e68-9d20-41e8a184bd5b" containerName="registry-server" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.165478 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.169522 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.169839 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.176661 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.181112 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-wc6zf"] Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.276492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmcj9\" (UniqueName: \"kubernetes.io/projected/e564772e-8d64-4f52-a570-a89fb04b1560-kube-api-access-mmcj9\") pod \"auto-csr-approver-29555476-wc6zf\" (UID: \"e564772e-8d64-4f52-a570-a89fb04b1560\") " pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.380140 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmcj9\" (UniqueName: \"kubernetes.io/projected/e564772e-8d64-4f52-a570-a89fb04b1560-kube-api-access-mmcj9\") pod \"auto-csr-approver-29555476-wc6zf\" (UID: \"e564772e-8d64-4f52-a570-a89fb04b1560\") " pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.408743 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmcj9\" (UniqueName: \"kubernetes.io/projected/e564772e-8d64-4f52-a570-a89fb04b1560-kube-api-access-mmcj9\") pod \"auto-csr-approver-29555476-wc6zf\" (UID: \"e564772e-8d64-4f52-a570-a89fb04b1560\") " 
pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.487579 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:00 crc kubenswrapper[4921]: I0312 15:16:00.974575 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-wc6zf"] Mar 12 15:16:01 crc kubenswrapper[4921]: I0312 15:16:01.807614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" event={"ID":"e564772e-8d64-4f52-a570-a89fb04b1560","Type":"ContainerStarted","Data":"25c4b17dd38a8d8e446e6b17b12ea822e8870a46c96e419009d52291f53c1078"} Mar 12 15:16:02 crc kubenswrapper[4921]: I0312 15:16:02.822508 4921 generic.go:334] "Generic (PLEG): container finished" podID="e564772e-8d64-4f52-a570-a89fb04b1560" containerID="0bc7cb9c8a0c61cda77e7fff50e03fdded791dae73ba4b596295c5210254e41f" exitCode=0 Mar 12 15:16:02 crc kubenswrapper[4921]: I0312 15:16:02.822706 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" event={"ID":"e564772e-8d64-4f52-a570-a89fb04b1560","Type":"ContainerDied","Data":"0bc7cb9c8a0c61cda77e7fff50e03fdded791dae73ba4b596295c5210254e41f"} Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.373877 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.384725 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmcj9\" (UniqueName: \"kubernetes.io/projected/e564772e-8d64-4f52-a570-a89fb04b1560-kube-api-access-mmcj9\") pod \"e564772e-8d64-4f52-a570-a89fb04b1560\" (UID: \"e564772e-8d64-4f52-a570-a89fb04b1560\") " Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.398016 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e564772e-8d64-4f52-a570-a89fb04b1560-kube-api-access-mmcj9" (OuterVolumeSpecName: "kube-api-access-mmcj9") pod "e564772e-8d64-4f52-a570-a89fb04b1560" (UID: "e564772e-8d64-4f52-a570-a89fb04b1560"). InnerVolumeSpecName "kube-api-access-mmcj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.488577 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmcj9\" (UniqueName: \"kubernetes.io/projected/e564772e-8d64-4f52-a570-a89fb04b1560-kube-api-access-mmcj9\") on node \"crc\" DevicePath \"\"" Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.846500 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" event={"ID":"e564772e-8d64-4f52-a570-a89fb04b1560","Type":"ContainerDied","Data":"25c4b17dd38a8d8e446e6b17b12ea822e8870a46c96e419009d52291f53c1078"} Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.846586 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25c4b17dd38a8d8e446e6b17b12ea822e8870a46c96e419009d52291f53c1078" Mar 12 15:16:04 crc kubenswrapper[4921]: I0312 15:16:04.846646 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555476-wc6zf" Mar 12 15:16:05 crc kubenswrapper[4921]: I0312 15:16:05.470976 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-mbhr7"] Mar 12 15:16:05 crc kubenswrapper[4921]: I0312 15:16:05.501909 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555470-mbhr7"] Mar 12 15:16:06 crc kubenswrapper[4921]: I0312 15:16:06.005211 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0764e67-c3d7-4d48-a31d-e6403fe9d048" path="/var/lib/kubelet/pods/c0764e67-c3d7-4d48-a31d-e6403fe9d048/volumes" Mar 12 15:16:26 crc kubenswrapper[4921]: I0312 15:16:26.323621 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:16:26 crc kubenswrapper[4921]: I0312 15:16:26.324300 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:16:50 crc kubenswrapper[4921]: I0312 15:16:50.866620 4921 scope.go:117] "RemoveContainer" containerID="7f8b634da4801157cdc1d96a64e957afb59bd934b2ed417eb3492f88977838b8" Mar 12 15:16:50 crc kubenswrapper[4921]: I0312 15:16:50.900548 4921 scope.go:117] "RemoveContainer" containerID="a61426a05b81ff5085138f21e240d4376d4b439a72500310e7ace0abaa77ce1f" Mar 12 15:16:50 crc kubenswrapper[4921]: I0312 15:16:50.942353 4921 scope.go:117] "RemoveContainer" containerID="082056f136269cc34e282f084b54eb6a2a88e2b993f332a94261540229b9957d" Mar 12 15:16:50 crc 
kubenswrapper[4921]: I0312 15:16:50.976872 4921 scope.go:117] "RemoveContainer" containerID="5c8daea8bc01256c00c454f0edea323222cca69e7f906ef89a16e3e65e5c71f9" Mar 12 15:16:56 crc kubenswrapper[4921]: I0312 15:16:56.323789 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:16:56 crc kubenswrapper[4921]: I0312 15:16:56.324978 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.324040 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.325166 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.325247 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.326429 4921 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e35138a155ccdd9d0ac38b057cae71bd9423544b9bf4cb58d46480243f059f38"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.326500 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://e35138a155ccdd9d0ac38b057cae71bd9423544b9bf4cb58d46480243f059f38" gracePeriod=600 Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.699675 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="e35138a155ccdd9d0ac38b057cae71bd9423544b9bf4cb58d46480243f059f38" exitCode=0 Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.699788 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"e35138a155ccdd9d0ac38b057cae71bd9423544b9bf4cb58d46480243f059f38"} Mar 12 15:17:26 crc kubenswrapper[4921]: I0312 15:17:26.700293 4921 scope.go:117] "RemoveContainer" containerID="83792a03e4222b1c0587fd4d328b351542967ba51e61d020fcb2d4204aa1c278" Mar 12 15:17:27 crc kubenswrapper[4921]: I0312 15:17:27.717330 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"} Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.163733 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555478-nbrck"] Mar 12 15:18:00 crc 
kubenswrapper[4921]: E0312 15:18:00.165321 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e564772e-8d64-4f52-a570-a89fb04b1560" containerName="oc" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.165342 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="e564772e-8d64-4f52-a570-a89fb04b1560" containerName="oc" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.165663 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="e564772e-8d64-4f52-a570-a89fb04b1560" containerName="oc" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.166843 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.170412 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.171923 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.208205 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.212906 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-nbrck"] Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.326017 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgz4d\" (UniqueName: \"kubernetes.io/projected/5ef400fc-80d5-42bd-9a95-b63a725790e4-kube-api-access-tgz4d\") pod \"auto-csr-approver-29555478-nbrck\" (UID: \"5ef400fc-80d5-42bd-9a95-b63a725790e4\") " pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.428384 4921 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tgz4d\" (UniqueName: \"kubernetes.io/projected/5ef400fc-80d5-42bd-9a95-b63a725790e4-kube-api-access-tgz4d\") pod \"auto-csr-approver-29555478-nbrck\" (UID: \"5ef400fc-80d5-42bd-9a95-b63a725790e4\") " pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.458526 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgz4d\" (UniqueName: \"kubernetes.io/projected/5ef400fc-80d5-42bd-9a95-b63a725790e4-kube-api-access-tgz4d\") pod \"auto-csr-approver-29555478-nbrck\" (UID: \"5ef400fc-80d5-42bd-9a95-b63a725790e4\") " pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:00 crc kubenswrapper[4921]: I0312 15:18:00.525780 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:01 crc kubenswrapper[4921]: I0312 15:18:01.054531 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-nbrck"] Mar 12 15:18:01 crc kubenswrapper[4921]: I0312 15:18:01.099258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-nbrck" event={"ID":"5ef400fc-80d5-42bd-9a95-b63a725790e4","Type":"ContainerStarted","Data":"38756ffcf2c370ee6738fe75d03814b59618b2d445d4970a24e3cc0ad1bffae6"} Mar 12 15:18:03 crc kubenswrapper[4921]: I0312 15:18:03.120709 4921 generic.go:334] "Generic (PLEG): container finished" podID="5ef400fc-80d5-42bd-9a95-b63a725790e4" containerID="bd382fc7d191d23425fab8be0a14c6b79e6fada1e4101839597734931d0c0a6b" exitCode=0 Mar 12 15:18:03 crc kubenswrapper[4921]: I0312 15:18:03.120823 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-nbrck" event={"ID":"5ef400fc-80d5-42bd-9a95-b63a725790e4","Type":"ContainerDied","Data":"bd382fc7d191d23425fab8be0a14c6b79e6fada1e4101839597734931d0c0a6b"} Mar 12 15:18:04 crc 
kubenswrapper[4921]: I0312 15:18:04.532786 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:04 crc kubenswrapper[4921]: I0312 15:18:04.659393 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgz4d\" (UniqueName: \"kubernetes.io/projected/5ef400fc-80d5-42bd-9a95-b63a725790e4-kube-api-access-tgz4d\") pod \"5ef400fc-80d5-42bd-9a95-b63a725790e4\" (UID: \"5ef400fc-80d5-42bd-9a95-b63a725790e4\") " Mar 12 15:18:04 crc kubenswrapper[4921]: I0312 15:18:04.667080 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef400fc-80d5-42bd-9a95-b63a725790e4-kube-api-access-tgz4d" (OuterVolumeSpecName: "kube-api-access-tgz4d") pod "5ef400fc-80d5-42bd-9a95-b63a725790e4" (UID: "5ef400fc-80d5-42bd-9a95-b63a725790e4"). InnerVolumeSpecName "kube-api-access-tgz4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:18:04 crc kubenswrapper[4921]: I0312 15:18:04.762686 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgz4d\" (UniqueName: \"kubernetes.io/projected/5ef400fc-80d5-42bd-9a95-b63a725790e4-kube-api-access-tgz4d\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:05 crc kubenswrapper[4921]: I0312 15:18:05.139152 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555478-nbrck" event={"ID":"5ef400fc-80d5-42bd-9a95-b63a725790e4","Type":"ContainerDied","Data":"38756ffcf2c370ee6738fe75d03814b59618b2d445d4970a24e3cc0ad1bffae6"} Mar 12 15:18:05 crc kubenswrapper[4921]: I0312 15:18:05.139192 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38756ffcf2c370ee6738fe75d03814b59618b2d445d4970a24e3cc0ad1bffae6" Mar 12 15:18:05 crc kubenswrapper[4921]: I0312 15:18:05.139259 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555478-nbrck" Mar 12 15:18:05 crc kubenswrapper[4921]: I0312 15:18:05.608234 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-km95k"] Mar 12 15:18:05 crc kubenswrapper[4921]: I0312 15:18:05.616590 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555472-km95k"] Mar 12 15:18:06 crc kubenswrapper[4921]: I0312 15:18:05.999912 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e90165-7578-4c29-883a-644ab365bfff" path="/var/lib/kubelet/pods/81e90165-7578-4c29-883a-644ab365bfff/volumes" Mar 12 15:18:51 crc kubenswrapper[4921]: I0312 15:18:51.134009 4921 scope.go:117] "RemoveContainer" containerID="49b3cbd7bc2c5effb9ac02846a06fee2fb4759ff125d59a1b944e7690beea75b" Mar 12 15:18:55 crc kubenswrapper[4921]: I0312 15:18:55.703135 4921 generic.go:334] "Generic (PLEG): container finished" podID="b061c47e-9c37-48ed-a879-9263d780de9f" containerID="7f87c3a680a9388c9ad8b04a2749f8310e51245fe52013e18a20e5f8f7775e41" exitCode=0 Mar 12 15:18:55 crc kubenswrapper[4921]: I0312 15:18:55.703270 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b061c47e-9c37-48ed-a879-9263d780de9f","Type":"ContainerDied","Data":"7f87c3a680a9388c9ad8b04a2749f8310e51245fe52013e18a20e5f8f7775e41"} Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.552114 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.653760 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.654270 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-temporary\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.654584 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ssh-key\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.654739 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-workdir\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.654829 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nqqk\" (UniqueName: \"kubernetes.io/projected/b061c47e-9c37-48ed-a879-9263d780de9f-kube-api-access-9nqqk\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.654927 4921 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ca-certs\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.655003 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.655150 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config-secret\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.655228 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.655296 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-config-data\") pod \"b061c47e-9c37-48ed-a879-9263d780de9f\" (UID: \"b061c47e-9c37-48ed-a879-9263d780de9f\") " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.655998 4921 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.657675 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-config-data" (OuterVolumeSpecName: "config-data") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.662863 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b061c47e-9c37-48ed-a879-9263d780de9f-kube-api-access-9nqqk" (OuterVolumeSpecName: "kube-api-access-9nqqk") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "kube-api-access-9nqqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.663500 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "local-storage17-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.668352 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.690667 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.700374 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.703142 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.718400 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b061c47e-9c37-48ed-a879-9263d780de9f" (UID: "b061c47e-9c37-48ed-a879-9263d780de9f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.730127 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b061c47e-9c37-48ed-a879-9263d780de9f","Type":"ContainerDied","Data":"470c994aec24362bae4a0fe564f434cb8f71bf79cf999e9d406b81c3e7b4ca7f"} Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.730187 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="470c994aec24362bae4a0fe564f434cb8f71bf79cf999e9d406b81c3e7b4ca7f" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.730244 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.758789 4921 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b061c47e-9c37-48ed-a879-9263d780de9f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.758929 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nqqk\" (UniqueName: \"kubernetes.io/projected/b061c47e-9c37-48ed-a879-9263d780de9f-kube-api-access-9nqqk\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.758987 4921 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.759039 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.759139 4921 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.759205 4921 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.759260 4921 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b061c47e-9c37-48ed-a879-9263d780de9f-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.759312 4921 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b061c47e-9c37-48ed-a879-9263d780de9f-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.788425 4921 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Mar 12 15:18:57 crc kubenswrapper[4921]: I0312 15:18:57.861799 4921 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.603224 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 12 15:19:04 crc kubenswrapper[4921]: E0312 15:19:04.604467 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b061c47e-9c37-48ed-a879-9263d780de9f" containerName="tempest-tests-tempest-tests-runner" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.604490 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="b061c47e-9c37-48ed-a879-9263d780de9f" 
containerName="tempest-tests-tempest-tests-runner" Mar 12 15:19:04 crc kubenswrapper[4921]: E0312 15:19:04.604516 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef400fc-80d5-42bd-9a95-b63a725790e4" containerName="oc" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.604524 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef400fc-80d5-42bd-9a95-b63a725790e4" containerName="oc" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.604846 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef400fc-80d5-42bd-9a95-b63a725790e4" containerName="oc" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.604872 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="b061c47e-9c37-48ed-a879-9263d780de9f" containerName="tempest-tests-tempest-tests-runner" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.605899 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.608841 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-r9plt" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.621771 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.759505 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.759693 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fzr8g\" (UniqueName: \"kubernetes.io/projected/5d16b762-c737-4831-ae57-099f1da5d7fb-kube-api-access-fzr8g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.861748 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzr8g\" (UniqueName: \"kubernetes.io/projected/5d16b762-c737-4831-ae57-099f1da5d7fb-kube-api-access-fzr8g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.861932 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.863545 4921 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") device mount path \"/mnt/openstack/pv17\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.895097 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzr8g\" (UniqueName: \"kubernetes.io/projected/5d16b762-c737-4831-ae57-099f1da5d7fb-kube-api-access-fzr8g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.897205 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5d16b762-c737-4831-ae57-099f1da5d7fb\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:04 crc kubenswrapper[4921]: I0312 15:19:04.937444 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 12 15:19:05 crc kubenswrapper[4921]: I0312 15:19:05.488734 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 12 15:19:05 crc kubenswrapper[4921]: I0312 15:19:05.844449 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5d16b762-c737-4831-ae57-099f1da5d7fb","Type":"ContainerStarted","Data":"b5f64ad55858ac70621bcc506c151b18607fd6b499cafacf591a313de4566166"} Mar 12 15:19:06 crc kubenswrapper[4921]: I0312 15:19:06.859175 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5d16b762-c737-4831-ae57-099f1da5d7fb","Type":"ContainerStarted","Data":"b99496f0c9bf58821e48ef38be2c1214bdbf9f98715187c1a36218d756e332bc"} Mar 12 15:19:06 crc kubenswrapper[4921]: I0312 15:19:06.876189 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.038877683 podStartE2EDuration="2.876170277s" podCreationTimestamp="2026-03-12 15:19:04 +0000 UTC" firstStartedPulling="2026-03-12 15:19:05.500601588 +0000 UTC m=+7768.190673559" lastFinishedPulling="2026-03-12 
15:19:06.337894172 +0000 UTC m=+7769.027966153" observedRunningTime="2026-03-12 15:19:06.873316878 +0000 UTC m=+7769.563388849" watchObservedRunningTime="2026-03-12 15:19:06.876170277 +0000 UTC m=+7769.566242248" Mar 12 15:19:26 crc kubenswrapper[4921]: I0312 15:19:26.323664 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:19:26 crc kubenswrapper[4921]: I0312 15:19:26.324657 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.704771 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcf58/must-gather-nhsgw"] Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.708792 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.717423 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tcf58"/"openshift-service-ca.crt" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.719486 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-tcf58"/"default-dockercfg-l689r" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.720175 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-tcf58"/"kube-root-ca.crt" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.725424 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tcf58/must-gather-nhsgw"] Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.784653 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67c4e944-fd05-4288-934f-5ecbe702e2b6-must-gather-output\") pod \"must-gather-nhsgw\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.785132 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhb6v\" (UniqueName: \"kubernetes.io/projected/67c4e944-fd05-4288-934f-5ecbe702e2b6-kube-api-access-bhb6v\") pod \"must-gather-nhsgw\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.888205 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhb6v\" (UniqueName: \"kubernetes.io/projected/67c4e944-fd05-4288-934f-5ecbe702e2b6-kube-api-access-bhb6v\") pod \"must-gather-nhsgw\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " 
pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.888401 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67c4e944-fd05-4288-934f-5ecbe702e2b6-must-gather-output\") pod \"must-gather-nhsgw\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.888945 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67c4e944-fd05-4288-934f-5ecbe702e2b6-must-gather-output\") pod \"must-gather-nhsgw\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:31 crc kubenswrapper[4921]: I0312 15:19:31.911323 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhb6v\" (UniqueName: \"kubernetes.io/projected/67c4e944-fd05-4288-934f-5ecbe702e2b6-kube-api-access-bhb6v\") pod \"must-gather-nhsgw\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:32 crc kubenswrapper[4921]: I0312 15:19:32.029918 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:19:32 crc kubenswrapper[4921]: I0312 15:19:32.588138 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tcf58/must-gather-nhsgw"] Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.012484 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5622m"] Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.014674 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.034092 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5622m"] Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.119945 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8698537-b9bf-41de-9d12-68d07948c6e4-utilities\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.120461 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8698537-b9bf-41de-9d12-68d07948c6e4-catalog-content\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.120638 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwmqv\" (UniqueName: \"kubernetes.io/projected/e8698537-b9bf-41de-9d12-68d07948c6e4-kube-api-access-vwmqv\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.148705 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/must-gather-nhsgw" event={"ID":"67c4e944-fd05-4288-934f-5ecbe702e2b6","Type":"ContainerStarted","Data":"c13d8f47b8e849d3e4a7c371b57d13fff102c63ff29c1fe42636815fc6bf6534"} Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.223206 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e8698537-b9bf-41de-9d12-68d07948c6e4-utilities\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.223371 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8698537-b9bf-41de-9d12-68d07948c6e4-catalog-content\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.223430 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwmqv\" (UniqueName: \"kubernetes.io/projected/e8698537-b9bf-41de-9d12-68d07948c6e4-kube-api-access-vwmqv\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.224407 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8698537-b9bf-41de-9d12-68d07948c6e4-catalog-content\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.224491 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8698537-b9bf-41de-9d12-68d07948c6e4-utilities\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.244905 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwmqv\" (UniqueName: 
\"kubernetes.io/projected/e8698537-b9bf-41de-9d12-68d07948c6e4-kube-api-access-vwmqv\") pod \"community-operators-5622m\" (UID: \"e8698537-b9bf-41de-9d12-68d07948c6e4\") " pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:33 crc kubenswrapper[4921]: I0312 15:19:33.342022 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5622m" Mar 12 15:19:34 crc kubenswrapper[4921]: W0312 15:19:34.009954 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8698537_b9bf_41de_9d12_68d07948c6e4.slice/crio-ebc072a77b8dc071f5956d6c51ed6c6cd1adfb1b0b02de66f17bce90f1fffcb7 WatchSource:0}: Error finding container ebc072a77b8dc071f5956d6c51ed6c6cd1adfb1b0b02de66f17bce90f1fffcb7: Status 404 returned error can't find the container with id ebc072a77b8dc071f5956d6c51ed6c6cd1adfb1b0b02de66f17bce90f1fffcb7 Mar 12 15:19:34 crc kubenswrapper[4921]: I0312 15:19:34.011389 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5622m"] Mar 12 15:19:34 crc kubenswrapper[4921]: I0312 15:19:34.164480 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5622m" event={"ID":"e8698537-b9bf-41de-9d12-68d07948c6e4","Type":"ContainerStarted","Data":"ebc072a77b8dc071f5956d6c51ed6c6cd1adfb1b0b02de66f17bce90f1fffcb7"} Mar 12 15:19:35 crc kubenswrapper[4921]: I0312 15:19:35.179738 4921 generic.go:334] "Generic (PLEG): container finished" podID="e8698537-b9bf-41de-9d12-68d07948c6e4" containerID="17f73e7cd5f8b387c17b5f2dc25012af7bc8523c7ad554ca5fb5a1fa3c33756f" exitCode=0 Mar 12 15:19:35 crc kubenswrapper[4921]: I0312 15:19:35.179863 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5622m" 
event={"ID":"e8698537-b9bf-41de-9d12-68d07948c6e4","Type":"ContainerDied","Data":"17f73e7cd5f8b387c17b5f2dc25012af7bc8523c7ad554ca5fb5a1fa3c33756f"} Mar 12 15:19:40 crc kubenswrapper[4921]: I0312 15:19:40.260572 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/must-gather-nhsgw" event={"ID":"67c4e944-fd05-4288-934f-5ecbe702e2b6","Type":"ContainerStarted","Data":"d422d7efbeeafdc3adc7ce07afb9585227326b52c138b103905cc028668a2e26"} Mar 12 15:19:40 crc kubenswrapper[4921]: I0312 15:19:40.261564 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/must-gather-nhsgw" event={"ID":"67c4e944-fd05-4288-934f-5ecbe702e2b6","Type":"ContainerStarted","Data":"36f8982c976b2d67bd2a6d42d369b39129ee567469a0f23d436f05d556bc937a"} Mar 12 15:19:40 crc kubenswrapper[4921]: I0312 15:19:40.278215 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tcf58/must-gather-nhsgw" podStartSLOduration=2.374170812 podStartE2EDuration="9.27819681s" podCreationTimestamp="2026-03-12 15:19:31 +0000 UTC" firstStartedPulling="2026-03-12 15:19:32.598022385 +0000 UTC m=+7795.288094356" lastFinishedPulling="2026-03-12 15:19:39.502048373 +0000 UTC m=+7802.192120354" observedRunningTime="2026-03-12 15:19:40.274763413 +0000 UTC m=+7802.964835404" watchObservedRunningTime="2026-03-12 15:19:40.27819681 +0000 UTC m=+7802.968268781" Mar 12 15:19:44 crc kubenswrapper[4921]: I0312 15:19:44.306425 4921 generic.go:334] "Generic (PLEG): container finished" podID="e8698537-b9bf-41de-9d12-68d07948c6e4" containerID="04956ce737bbb68527d53bff54375fe2cef2f475068cd3f09ed2c9d3a2abc785" exitCode=0 Mar 12 15:19:44 crc kubenswrapper[4921]: I0312 15:19:44.306541 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5622m" 
event={"ID":"e8698537-b9bf-41de-9d12-68d07948c6e4","Type":"ContainerDied","Data":"04956ce737bbb68527d53bff54375fe2cef2f475068cd3f09ed2c9d3a2abc785"} Mar 12 15:19:45 crc kubenswrapper[4921]: E0312 15:19:45.085494 4921 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.192:54174->38.102.83.192:46775: write tcp 38.102.83.192:54174->38.102.83.192:46775: write: broken pipe Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.795249 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcf58/crc-debug-k6n9c"] Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.802041 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.880567 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a873062-4c06-41b3-9078-4132ee69b343-host\") pod \"crc-debug-k6n9c\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.881128 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/5a873062-4c06-41b3-9078-4132ee69b343-kube-api-access-snvs8\") pod \"crc-debug-k6n9c\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.983839 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a873062-4c06-41b3-9078-4132ee69b343-host\") pod \"crc-debug-k6n9c\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.984020 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/5a873062-4c06-41b3-9078-4132ee69b343-kube-api-access-snvs8\") pod \"crc-debug-k6n9c\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:45 crc kubenswrapper[4921]: I0312 15:19:45.984600 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a873062-4c06-41b3-9078-4132ee69b343-host\") pod \"crc-debug-k6n9c\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:46 crc kubenswrapper[4921]: I0312 15:19:46.017214 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/5a873062-4c06-41b3-9078-4132ee69b343-kube-api-access-snvs8\") pod \"crc-debug-k6n9c\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:19:46 crc kubenswrapper[4921]: I0312 15:19:46.128035 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-k6n9c"
Mar 12 15:19:46 crc kubenswrapper[4921]: W0312 15:19:46.167856 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a873062_4c06_41b3_9078_4132ee69b343.slice/crio-38cb17d6d010b1fec7d9edd73b46bdcfe90955fb9d54a14eea2cdce4c72caf8c WatchSource:0}: Error finding container 38cb17d6d010b1fec7d9edd73b46bdcfe90955fb9d54a14eea2cdce4c72caf8c: Status 404 returned error can't find the container with id 38cb17d6d010b1fec7d9edd73b46bdcfe90955fb9d54a14eea2cdce4c72caf8c
Mar 12 15:19:46 crc kubenswrapper[4921]: I0312 15:19:46.327592 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" event={"ID":"5a873062-4c06-41b3-9078-4132ee69b343","Type":"ContainerStarted","Data":"38cb17d6d010b1fec7d9edd73b46bdcfe90955fb9d54a14eea2cdce4c72caf8c"}
Mar 12 15:19:46 crc kubenswrapper[4921]: I0312 15:19:46.330282 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5622m" event={"ID":"e8698537-b9bf-41de-9d12-68d07948c6e4","Type":"ContainerStarted","Data":"9438e3c715d4bc0965203020087beb38d714da9d0fb11f93848d220f2d0ae6ce"}
Mar 12 15:19:46 crc kubenswrapper[4921]: I0312 15:19:46.362201 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5622m" podStartSLOduration=6.130309723 podStartE2EDuration="14.362175538s" podCreationTimestamp="2026-03-12 15:19:32 +0000 UTC" firstStartedPulling="2026-03-12 15:19:37.719472423 +0000 UTC m=+7800.409544404" lastFinishedPulling="2026-03-12 15:19:45.951338248 +0000 UTC m=+7808.641410219" observedRunningTime="2026-03-12 15:19:46.353469868 +0000 UTC m=+7809.043541849" watchObservedRunningTime="2026-03-12 15:19:46.362175538 +0000 UTC m=+7809.052247509"
Mar 12 15:19:53 crc kubenswrapper[4921]: I0312 15:19:53.343413 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5622m"
Mar 12 15:19:53 crc kubenswrapper[4921]: I0312 15:19:53.345333 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5622m"
Mar 12 15:19:53 crc kubenswrapper[4921]: I0312 15:19:53.509593 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5622m"
Mar 12 15:19:54 crc kubenswrapper[4921]: I0312 15:19:54.527851 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5622m"
Mar 12 15:19:54 crc kubenswrapper[4921]: I0312 15:19:54.610846 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5622m"]
Mar 12 15:19:54 crc kubenswrapper[4921]: I0312 15:19:54.684383 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p9jwl"]
Mar 12 15:19:54 crc kubenswrapper[4921]: I0312 15:19:54.684749 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p9jwl" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="registry-server" containerID="cri-o://9f946deaa317dec1a6d085307e0b2b2510010f28bb5468f01c87f677fbac90ee" gracePeriod=2
Mar 12 15:19:55 crc kubenswrapper[4921]: I0312 15:19:55.477628 4921 generic.go:334] "Generic (PLEG): container finished" podID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerID="9f946deaa317dec1a6d085307e0b2b2510010f28bb5468f01c87f677fbac90ee" exitCode=0
Mar 12 15:19:55 crc kubenswrapper[4921]: I0312 15:19:55.477792 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9jwl" event={"ID":"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1","Type":"ContainerDied","Data":"9f946deaa317dec1a6d085307e0b2b2510010f28bb5468f01c87f677fbac90ee"}
Mar 12 15:19:56 crc kubenswrapper[4921]: I0312 15:19:56.324625 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:19:56 crc kubenswrapper[4921]: I0312 15:19:56.325669 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.145107 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9jwl"
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.210705 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-utilities\") pod \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") "
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.210776 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-catalog-content\") pod \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") "
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.210986 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r87xh\" (UniqueName: \"kubernetes.io/projected/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-kube-api-access-r87xh\") pod \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\" (UID: \"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1\") "
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.213667 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-utilities" (OuterVolumeSpecName: "utilities") pod "3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" (UID: "3676f34a-5a2a-49ce-96fc-ec150d8cd6d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.232226 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-kube-api-access-r87xh" (OuterVolumeSpecName: "kube-api-access-r87xh") pod "3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" (UID: "3676f34a-5a2a-49ce-96fc-ec150d8cd6d1"). InnerVolumeSpecName "kube-api-access-r87xh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.266721 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" (UID: "3676f34a-5a2a-49ce-96fc-ec150d8cd6d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.315080 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.315544 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r87xh\" (UniqueName: \"kubernetes.io/projected/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-kube-api-access-r87xh\") on node \"crc\" DevicePath \"\""
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.315556 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.527233 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" event={"ID":"5a873062-4c06-41b3-9078-4132ee69b343","Type":"ContainerStarted","Data":"11a2ce2cf26685312e91198842d10ea17050f901589e9477b94761370703cc4a"}
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.541606 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p9jwl" event={"ID":"3676f34a-5a2a-49ce-96fc-ec150d8cd6d1","Type":"ContainerDied","Data":"47ac42119835c1fffacb0ded8f5954f03b09b09fa408da7ca743b1940c6b4f02"}
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.541688 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p9jwl"
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.541890 4921 scope.go:117] "RemoveContainer" containerID="9f946deaa317dec1a6d085307e0b2b2510010f28bb5468f01c87f677fbac90ee"
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.559135 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" podStartSLOduration=1.956916154 podStartE2EDuration="13.559107454s" podCreationTimestamp="2026-03-12 15:19:45 +0000 UTC" firstStartedPulling="2026-03-12 15:19:46.170737267 +0000 UTC m=+7808.860809238" lastFinishedPulling="2026-03-12 15:19:57.772928567 +0000 UTC m=+7820.463000538" observedRunningTime="2026-03-12 15:19:58.542600462 +0000 UTC m=+7821.232672443" watchObservedRunningTime="2026-03-12 15:19:58.559107454 +0000 UTC m=+7821.249179425"
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.596522 4921 scope.go:117] "RemoveContainer" containerID="11b4fed238329cc5f698a62cbfcae7a83cdede498550ed9f6cd995d608dc9e95"
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.597961 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p9jwl"]
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.610679 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p9jwl"]
Mar 12 15:19:58 crc kubenswrapper[4921]: I0312 15:19:58.629615 4921 scope.go:117] "RemoveContainer" containerID="f9205d5e51f29134dc1104be11b480e5c6d1de432f6dfa9b9f1b3901ce2eabaf"
Mar 12 15:19:59 crc kubenswrapper[4921]: I0312 15:19:59.998509 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" path="/var/lib/kubelet/pods/3676f34a-5a2a-49ce-96fc-ec150d8cd6d1/volumes"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.151671 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555480-jhdsm"]
Mar 12 15:20:00 crc kubenswrapper[4921]: E0312 15:20:00.152220 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="extract-utilities"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.152240 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="extract-utilities"
Mar 12 15:20:00 crc kubenswrapper[4921]: E0312 15:20:00.152262 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="extract-content"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.152268 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="extract-content"
Mar 12 15:20:00 crc kubenswrapper[4921]: E0312 15:20:00.152279 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="registry-server"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.152285 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="registry-server"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.152510 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="3676f34a-5a2a-49ce-96fc-ec150d8cd6d1" containerName="registry-server"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.153321 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.157881 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.158188 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.168412 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.172001 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-jhdsm"]
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.174454 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29glr\" (UniqueName: \"kubernetes.io/projected/28baf337-488f-431b-8f48-49fe07810f9e-kube-api-access-29glr\") pod \"auto-csr-approver-29555480-jhdsm\" (UID: \"28baf337-488f-431b-8f48-49fe07810f9e\") " pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.276013 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29glr\" (UniqueName: \"kubernetes.io/projected/28baf337-488f-431b-8f48-49fe07810f9e-kube-api-access-29glr\") pod \"auto-csr-approver-29555480-jhdsm\" (UID: \"28baf337-488f-431b-8f48-49fe07810f9e\") " pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.298752 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29glr\" (UniqueName: \"kubernetes.io/projected/28baf337-488f-431b-8f48-49fe07810f9e-kube-api-access-29glr\") pod \"auto-csr-approver-29555480-jhdsm\" (UID: \"28baf337-488f-431b-8f48-49fe07810f9e\") " pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:00 crc kubenswrapper[4921]: I0312 15:20:00.482908 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:01 crc kubenswrapper[4921]: I0312 15:20:01.002641 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-jhdsm"]
Mar 12 15:20:01 crc kubenswrapper[4921]: W0312 15:20:01.007269 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28baf337_488f_431b_8f48_49fe07810f9e.slice/crio-0e547f81a67142e395d715be69a7fd3227b39b2e58aa7c445943f43a1d1b2737 WatchSource:0}: Error finding container 0e547f81a67142e395d715be69a7fd3227b39b2e58aa7c445943f43a1d1b2737: Status 404 returned error can't find the container with id 0e547f81a67142e395d715be69a7fd3227b39b2e58aa7c445943f43a1d1b2737
Mar 12 15:20:01 crc kubenswrapper[4921]: I0312 15:20:01.583607 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555480-jhdsm" event={"ID":"28baf337-488f-431b-8f48-49fe07810f9e","Type":"ContainerStarted","Data":"0e547f81a67142e395d715be69a7fd3227b39b2e58aa7c445943f43a1d1b2737"}
Mar 12 15:20:04 crc kubenswrapper[4921]: I0312 15:20:04.614767 4921 generic.go:334] "Generic (PLEG): container finished" podID="28baf337-488f-431b-8f48-49fe07810f9e" containerID="6734177d1276d460b3666c08a4038268df44a6c4aea5144bf67e7b2557966335" exitCode=0
Mar 12 15:20:04 crc kubenswrapper[4921]: I0312 15:20:04.614954 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555480-jhdsm" event={"ID":"28baf337-488f-431b-8f48-49fe07810f9e","Type":"ContainerDied","Data":"6734177d1276d460b3666c08a4038268df44a6c4aea5144bf67e7b2557966335"}
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.055393 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.061964 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29glr\" (UniqueName: \"kubernetes.io/projected/28baf337-488f-431b-8f48-49fe07810f9e-kube-api-access-29glr\") pod \"28baf337-488f-431b-8f48-49fe07810f9e\" (UID: \"28baf337-488f-431b-8f48-49fe07810f9e\") "
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.072686 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28baf337-488f-431b-8f48-49fe07810f9e-kube-api-access-29glr" (OuterVolumeSpecName: "kube-api-access-29glr") pod "28baf337-488f-431b-8f48-49fe07810f9e" (UID: "28baf337-488f-431b-8f48-49fe07810f9e"). InnerVolumeSpecName "kube-api-access-29glr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.166103 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29glr\" (UniqueName: \"kubernetes.io/projected/28baf337-488f-431b-8f48-49fe07810f9e-kube-api-access-29glr\") on node \"crc\" DevicePath \"\""
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.637754 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555480-jhdsm" event={"ID":"28baf337-488f-431b-8f48-49fe07810f9e","Type":"ContainerDied","Data":"0e547f81a67142e395d715be69a7fd3227b39b2e58aa7c445943f43a1d1b2737"}
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.638160 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e547f81a67142e395d715be69a7fd3227b39b2e58aa7c445943f43a1d1b2737"
Mar 12 15:20:06 crc kubenswrapper[4921]: I0312 15:20:06.637927 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555480-jhdsm"
Mar 12 15:20:07 crc kubenswrapper[4921]: I0312 15:20:07.143025 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-9sn7h"]
Mar 12 15:20:07 crc kubenswrapper[4921]: I0312 15:20:07.154599 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555474-9sn7h"]
Mar 12 15:20:07 crc kubenswrapper[4921]: I0312 15:20:07.996787 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1892d6-8db6-484b-bdf1-75c061ee506a" path="/var/lib/kubelet/pods/5a1892d6-8db6-484b-bdf1-75c061ee506a/volumes"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.883024 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j86xl"]
Mar 12 15:20:20 crc kubenswrapper[4921]: E0312 15:20:20.885642 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28baf337-488f-431b-8f48-49fe07810f9e" containerName="oc"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.885661 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="28baf337-488f-431b-8f48-49fe07810f9e" containerName="oc"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.885893 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="28baf337-488f-431b-8f48-49fe07810f9e" containerName="oc"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.887756 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.896615 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j86xl"]
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.953270 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s46pv\" (UniqueName: \"kubernetes.io/projected/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-kube-api-access-s46pv\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.953402 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-catalog-content\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:20 crc kubenswrapper[4921]: I0312 15:20:20.953799 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-utilities\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.056633 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-utilities\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.056879 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s46pv\" (UniqueName: \"kubernetes.io/projected/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-kube-api-access-s46pv\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.056959 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-catalog-content\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.057206 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-utilities\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.057479 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-catalog-content\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.079169 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s46pv\" (UniqueName: \"kubernetes.io/projected/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-kube-api-access-s46pv\") pod \"certified-operators-j86xl\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") " pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.215528 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:21 crc kubenswrapper[4921]: I0312 15:20:21.797018 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j86xl"]
Mar 12 15:20:22 crc kubenswrapper[4921]: I0312 15:20:22.790361 4921 generic.go:334] "Generic (PLEG): container finished" podID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerID="e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e" exitCode=0
Mar 12 15:20:22 crc kubenswrapper[4921]: I0312 15:20:22.790479 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerDied","Data":"e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e"}
Mar 12 15:20:22 crc kubenswrapper[4921]: I0312 15:20:22.790920 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerStarted","Data":"b931b7bb72ff2a6971ee4ed3993bb3a175f5a8cdbe72b1be9ac080ecb590c7cc"}
Mar 12 15:20:23 crc kubenswrapper[4921]: I0312 15:20:23.802758 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerStarted","Data":"dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d"}
Mar 12 15:20:25 crc kubenswrapper[4921]: I0312 15:20:25.828555 4921 generic.go:334] "Generic (PLEG): container finished" podID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerID="dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d" exitCode=0
Mar 12 15:20:25 crc kubenswrapper[4921]: I0312 15:20:25.828657 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerDied","Data":"dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d"}
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.323646 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.324223 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.324270 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq"
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.324982 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.325040 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" gracePeriod=600
Mar 12 15:20:26 crc kubenswrapper[4921]: E0312 15:20:26.483458 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.844727 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerStarted","Data":"f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c"}
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.847241 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" exitCode=0
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.847275 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"}
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.847301 4921 scope.go:117] "RemoveContainer" containerID="e35138a155ccdd9d0ac38b057cae71bd9423544b9bf4cb58d46480243f059f38"
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.848675 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:20:26 crc kubenswrapper[4921]: E0312 15:20:26.849073 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:20:26 crc kubenswrapper[4921]: I0312 15:20:26.883535 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j86xl" podStartSLOduration=3.365347795 podStartE2EDuration="6.883506666s" podCreationTimestamp="2026-03-12 15:20:20 +0000 UTC" firstStartedPulling="2026-03-12 15:20:22.792788067 +0000 UTC m=+7845.482860038" lastFinishedPulling="2026-03-12 15:20:26.310946938 +0000 UTC m=+7849.001018909" observedRunningTime="2026-03-12 15:20:26.867593503 +0000 UTC m=+7849.557665494" watchObservedRunningTime="2026-03-12 15:20:26.883506666 +0000 UTC m=+7849.573578637"
Mar 12 15:20:31 crc kubenswrapper[4921]: I0312 15:20:31.216526 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:31 crc kubenswrapper[4921]: I0312 15:20:31.217414 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:32 crc kubenswrapper[4921]: I0312 15:20:32.267926 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-j86xl" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="registry-server" probeResult="failure" output=<
Mar 12 15:20:32 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s
Mar 12 15:20:32 crc kubenswrapper[4921]: >
Mar 12 15:20:38 crc kubenswrapper[4921]: I0312 15:20:38.983392 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:20:38 crc kubenswrapper[4921]: E0312 15:20:38.984478 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:20:41 crc kubenswrapper[4921]: I0312 15:20:41.280697 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:41 crc kubenswrapper[4921]: I0312 15:20:41.333920 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:41 crc kubenswrapper[4921]: I0312 15:20:41.525936 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j86xl"]
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.025934 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j86xl" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="registry-server" containerID="cri-o://f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c" gracePeriod=2
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.533396 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.620722 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s46pv\" (UniqueName: \"kubernetes.io/projected/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-kube-api-access-s46pv\") pod \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") "
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.620893 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-utilities\") pod \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") "
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.620960 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-catalog-content\") pod \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\" (UID: \"f2c601cc-83d1-42ec-8c21-f5f92ab9e778\") "
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.622074 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-utilities" (OuterVolumeSpecName: "utilities") pod "f2c601cc-83d1-42ec-8c21-f5f92ab9e778" (UID: "f2c601cc-83d1-42ec-8c21-f5f92ab9e778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.648035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-kube-api-access-s46pv" (OuterVolumeSpecName: "kube-api-access-s46pv") pod "f2c601cc-83d1-42ec-8c21-f5f92ab9e778" (UID: "f2c601cc-83d1-42ec-8c21-f5f92ab9e778"). InnerVolumeSpecName "kube-api-access-s46pv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.700320 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2c601cc-83d1-42ec-8c21-f5f92ab9e778" (UID: "f2c601cc-83d1-42ec-8c21-f5f92ab9e778"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.725003 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.725046 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 15:20:43 crc kubenswrapper[4921]: I0312 15:20:43.725058 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s46pv\" (UniqueName: \"kubernetes.io/projected/f2c601cc-83d1-42ec-8c21-f5f92ab9e778-kube-api-access-s46pv\") on node \"crc\" DevicePath \"\""
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.041453 4921 generic.go:334] "Generic (PLEG): container finished" podID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerID="f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c" exitCode=0
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.041525 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerDied","Data":"f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c"}
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.041543 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j86xl"
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.041579 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j86xl" event={"ID":"f2c601cc-83d1-42ec-8c21-f5f92ab9e778","Type":"ContainerDied","Data":"b931b7bb72ff2a6971ee4ed3993bb3a175f5a8cdbe72b1be9ac080ecb590c7cc"}
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.041614 4921 scope.go:117] "RemoveContainer" containerID="f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c"
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.063704 4921 scope.go:117] "RemoveContainer" containerID="dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d"
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.072952 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j86xl"]
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.088096 4921 scope.go:117] "RemoveContainer" containerID="e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e"
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.096724 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j86xl"]
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.144086 4921 scope.go:117] "RemoveContainer" containerID="f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c"
Mar 12 15:20:44 crc kubenswrapper[4921]: E0312 15:20:44.144989 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c\": container with ID starting with f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c not found: ID does not exist" containerID="f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c"
Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.145066
4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c"} err="failed to get container status \"f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c\": rpc error: code = NotFound desc = could not find container \"f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c\": container with ID starting with f1964924a053bba5ec9f308a4608abbbca8346be801a376f5c3e614e6174f64c not found: ID does not exist" Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.145107 4921 scope.go:117] "RemoveContainer" containerID="dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d" Mar 12 15:20:44 crc kubenswrapper[4921]: E0312 15:20:44.145508 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d\": container with ID starting with dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d not found: ID does not exist" containerID="dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d" Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.145571 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d"} err="failed to get container status \"dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d\": rpc error: code = NotFound desc = could not find container \"dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d\": container with ID starting with dd4c02a020bc1d870c07693d265cc4b06b2a0d04ff49562cde9860051f7b4a3d not found: ID does not exist" Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.145612 4921 scope.go:117] "RemoveContainer" containerID="e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e" Mar 12 15:20:44 crc kubenswrapper[4921]: E0312 
15:20:44.146103 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e\": container with ID starting with e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e not found: ID does not exist" containerID="e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e" Mar 12 15:20:44 crc kubenswrapper[4921]: I0312 15:20:44.146177 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e"} err="failed to get container status \"e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e\": rpc error: code = NotFound desc = could not find container \"e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e\": container with ID starting with e2fd43fa408ce7b9337c3aa26b7617b74c4c9dea9335459502f176bde576df1e not found: ID does not exist" Mar 12 15:20:45 crc kubenswrapper[4921]: I0312 15:20:45.997494 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" path="/var/lib/kubelet/pods/f2c601cc-83d1-42ec-8c21-f5f92ab9e778/volumes" Mar 12 15:20:49 crc kubenswrapper[4921]: I0312 15:20:49.120984 4921 generic.go:334] "Generic (PLEG): container finished" podID="5a873062-4c06-41b3-9078-4132ee69b343" containerID="11a2ce2cf26685312e91198842d10ea17050f901589e9477b94761370703cc4a" exitCode=0 Mar 12 15:20:49 crc kubenswrapper[4921]: I0312 15:20:49.121036 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" event={"ID":"5a873062-4c06-41b3-9078-4132ee69b343","Type":"ContainerDied","Data":"11a2ce2cf26685312e91198842d10ea17050f901589e9477b94761370703cc4a"} Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.264689 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.307676 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcf58/crc-debug-k6n9c"] Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.321461 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcf58/crc-debug-k6n9c"] Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.388757 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a873062-4c06-41b3-9078-4132ee69b343-host\") pod \"5a873062-4c06-41b3-9078-4132ee69b343\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.388903 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/5a873062-4c06-41b3-9078-4132ee69b343-kube-api-access-snvs8\") pod \"5a873062-4c06-41b3-9078-4132ee69b343\" (UID: \"5a873062-4c06-41b3-9078-4132ee69b343\") " Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.388882 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a873062-4c06-41b3-9078-4132ee69b343-host" (OuterVolumeSpecName: "host") pod "5a873062-4c06-41b3-9078-4132ee69b343" (UID: "5a873062-4c06-41b3-9078-4132ee69b343"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.389555 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a873062-4c06-41b3-9078-4132ee69b343-host\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.399131 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a873062-4c06-41b3-9078-4132ee69b343-kube-api-access-snvs8" (OuterVolumeSpecName: "kube-api-access-snvs8") pod "5a873062-4c06-41b3-9078-4132ee69b343" (UID: "5a873062-4c06-41b3-9078-4132ee69b343"). InnerVolumeSpecName "kube-api-access-snvs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:50 crc kubenswrapper[4921]: I0312 15:20:50.492034 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/5a873062-4c06-41b3-9078-4132ee69b343-kube-api-access-snvs8\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.142707 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38cb17d6d010b1fec7d9edd73b46bdcfe90955fb9d54a14eea2cdce4c72caf8c" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.142750 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-k6n9c" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.257172 4921 scope.go:117] "RemoveContainer" containerID="34db1c5a592b52e623a90e94d06110b85a45f62ab11a08bbc0513beb736161f2" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.631645 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcf58/crc-debug-tqpnl"] Mar 12 15:20:51 crc kubenswrapper[4921]: E0312 15:20:51.633669 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="extract-content" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.633701 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="extract-content" Mar 12 15:20:51 crc kubenswrapper[4921]: E0312 15:20:51.633827 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a873062-4c06-41b3-9078-4132ee69b343" containerName="container-00" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.633843 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a873062-4c06-41b3-9078-4132ee69b343" containerName="container-00" Mar 12 15:20:51 crc kubenswrapper[4921]: E0312 15:20:51.633856 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="registry-server" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.633864 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="registry-server" Mar 12 15:20:51 crc kubenswrapper[4921]: E0312 15:20:51.633890 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="extract-utilities" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.633900 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" 
containerName="extract-utilities" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.634210 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a873062-4c06-41b3-9078-4132ee69b343" containerName="container-00" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.634248 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c601cc-83d1-42ec-8c21-f5f92ab9e778" containerName="registry-server" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.635218 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.723270 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8efdfb02-7fef-4510-ba28-a62daf11571a-host\") pod \"crc-debug-tqpnl\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.731662 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49dtc\" (UniqueName: \"kubernetes.io/projected/8efdfb02-7fef-4510-ba28-a62daf11571a-kube-api-access-49dtc\") pod \"crc-debug-tqpnl\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.834782 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8efdfb02-7fef-4510-ba28-a62daf11571a-host\") pod \"crc-debug-tqpnl\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.834954 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8efdfb02-7fef-4510-ba28-a62daf11571a-host\") pod \"crc-debug-tqpnl\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.835008 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49dtc\" (UniqueName: \"kubernetes.io/projected/8efdfb02-7fef-4510-ba28-a62daf11571a-kube-api-access-49dtc\") pod \"crc-debug-tqpnl\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.856863 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49dtc\" (UniqueName: \"kubernetes.io/projected/8efdfb02-7fef-4510-ba28-a62daf11571a-kube-api-access-49dtc\") pod \"crc-debug-tqpnl\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:51 crc kubenswrapper[4921]: I0312 15:20:51.966745 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:52 crc kubenswrapper[4921]: I0312 15:20:52.012249 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a873062-4c06-41b3-9078-4132ee69b343" path="/var/lib/kubelet/pods/5a873062-4c06-41b3-9078-4132ee69b343/volumes" Mar 12 15:20:52 crc kubenswrapper[4921]: I0312 15:20:52.158147 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" event={"ID":"8efdfb02-7fef-4510-ba28-a62daf11571a","Type":"ContainerStarted","Data":"b051409cbc50971cf430e079bc4de166e9a28641f1fc65962776f4b4f47d2b93"} Mar 12 15:20:53 crc kubenswrapper[4921]: I0312 15:20:53.168726 4921 generic.go:334] "Generic (PLEG): container finished" podID="8efdfb02-7fef-4510-ba28-a62daf11571a" containerID="9f4cbf9225bc64836012350afca6bdae4e59fa1afad82e3cb8e61abf9ee83c48" exitCode=0 Mar 12 15:20:53 crc kubenswrapper[4921]: I0312 15:20:53.168824 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" event={"ID":"8efdfb02-7fef-4510-ba28-a62daf11571a","Type":"ContainerDied","Data":"9f4cbf9225bc64836012350afca6bdae4e59fa1afad82e3cb8e61abf9ee83c48"} Mar 12 15:20:53 crc kubenswrapper[4921]: I0312 15:20:53.984023 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:20:53 crc kubenswrapper[4921]: E0312 15:20:53.984618 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.296011 4921 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.395853 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8efdfb02-7fef-4510-ba28-a62daf11571a-host\") pod \"8efdfb02-7fef-4510-ba28-a62daf11571a\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.395936 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8efdfb02-7fef-4510-ba28-a62daf11571a-host" (OuterVolumeSpecName: "host") pod "8efdfb02-7fef-4510-ba28-a62daf11571a" (UID: "8efdfb02-7fef-4510-ba28-a62daf11571a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.396938 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49dtc\" (UniqueName: \"kubernetes.io/projected/8efdfb02-7fef-4510-ba28-a62daf11571a-kube-api-access-49dtc\") pod \"8efdfb02-7fef-4510-ba28-a62daf11571a\" (UID: \"8efdfb02-7fef-4510-ba28-a62daf11571a\") " Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.397887 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8efdfb02-7fef-4510-ba28-a62daf11571a-host\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.409652 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8efdfb02-7fef-4510-ba28-a62daf11571a-kube-api-access-49dtc" (OuterVolumeSpecName: "kube-api-access-49dtc") pod "8efdfb02-7fef-4510-ba28-a62daf11571a" (UID: "8efdfb02-7fef-4510-ba28-a62daf11571a"). InnerVolumeSpecName "kube-api-access-49dtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:54 crc kubenswrapper[4921]: I0312 15:20:54.499868 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49dtc\" (UniqueName: \"kubernetes.io/projected/8efdfb02-7fef-4510-ba28-a62daf11571a-kube-api-access-49dtc\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:55 crc kubenswrapper[4921]: I0312 15:20:55.208554 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" event={"ID":"8efdfb02-7fef-4510-ba28-a62daf11571a","Type":"ContainerDied","Data":"b051409cbc50971cf430e079bc4de166e9a28641f1fc65962776f4b4f47d2b93"} Mar 12 15:20:55 crc kubenswrapper[4921]: I0312 15:20:55.208645 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b051409cbc50971cf430e079bc4de166e9a28641f1fc65962776f4b4f47d2b93" Mar 12 15:20:55 crc kubenswrapper[4921]: I0312 15:20:55.208740 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-tqpnl" Mar 12 15:20:55 crc kubenswrapper[4921]: I0312 15:20:55.851040 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcf58/crc-debug-tqpnl"] Mar 12 15:20:55 crc kubenswrapper[4921]: I0312 15:20:55.864429 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcf58/crc-debug-tqpnl"] Mar 12 15:20:55 crc kubenswrapper[4921]: I0312 15:20:55.996573 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8efdfb02-7fef-4510-ba28-a62daf11571a" path="/var/lib/kubelet/pods/8efdfb02-7fef-4510-ba28-a62daf11571a/volumes" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.079464 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tcf58/crc-debug-8cdbz"] Mar 12 15:20:57 crc kubenswrapper[4921]: E0312 15:20:57.079994 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8efdfb02-7fef-4510-ba28-a62daf11571a" 
containerName="container-00" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.080009 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8efdfb02-7fef-4510-ba28-a62daf11571a" containerName="container-00" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.080607 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8efdfb02-7fef-4510-ba28-a62daf11571a" containerName="container-00" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.081416 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.162156 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cg9w\" (UniqueName: \"kubernetes.io/projected/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-kube-api-access-4cg9w\") pod \"crc-debug-8cdbz\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.162477 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-host\") pod \"crc-debug-8cdbz\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.265631 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cg9w\" (UniqueName: \"kubernetes.io/projected/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-kube-api-access-4cg9w\") pod \"crc-debug-8cdbz\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.265721 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-host\") pod \"crc-debug-8cdbz\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.265915 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-host\") pod \"crc-debug-8cdbz\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.296788 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cg9w\" (UniqueName: \"kubernetes.io/projected/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-kube-api-access-4cg9w\") pod \"crc-debug-8cdbz\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: I0312 15:20:57.406059 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:57 crc kubenswrapper[4921]: W0312 15:20:57.445148 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2af9bdd_a17a_414b_8fd0_7a4e44b64f71.slice/crio-eeb0432372d456a8eda072861fd413df5c4ee16387efd72aa63fc4748964557f WatchSource:0}: Error finding container eeb0432372d456a8eda072861fd413df5c4ee16387efd72aa63fc4748964557f: Status 404 returned error can't find the container with id eeb0432372d456a8eda072861fd413df5c4ee16387efd72aa63fc4748964557f Mar 12 15:20:58 crc kubenswrapper[4921]: I0312 15:20:58.243876 4921 generic.go:334] "Generic (PLEG): container finished" podID="f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" containerID="8961a30658e5eac0dc090c742c1b1bdab44a27d2f924dfebb987e932f855e415" exitCode=0 Mar 12 15:20:58 crc kubenswrapper[4921]: I0312 15:20:58.243962 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-8cdbz" event={"ID":"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71","Type":"ContainerDied","Data":"8961a30658e5eac0dc090c742c1b1bdab44a27d2f924dfebb987e932f855e415"} Mar 12 15:20:58 crc kubenswrapper[4921]: I0312 15:20:58.245488 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/crc-debug-8cdbz" event={"ID":"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71","Type":"ContainerStarted","Data":"eeb0432372d456a8eda072861fd413df5c4ee16387efd72aa63fc4748964557f"} Mar 12 15:20:58 crc kubenswrapper[4921]: I0312 15:20:58.293643 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcf58/crc-debug-8cdbz"] Mar 12 15:20:58 crc kubenswrapper[4921]: I0312 15:20:58.308544 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcf58/crc-debug-8cdbz"] Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.378396 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.424091 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cg9w\" (UniqueName: \"kubernetes.io/projected/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-kube-api-access-4cg9w\") pod \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.424248 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-host\") pod \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\" (UID: \"f2af9bdd-a17a-414b-8fd0-7a4e44b64f71\") " Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.424377 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-host" (OuterVolumeSpecName: "host") pod "f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" (UID: "f2af9bdd-a17a-414b-8fd0-7a4e44b64f71"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.425095 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-host\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.433127 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-kube-api-access-4cg9w" (OuterVolumeSpecName: "kube-api-access-4cg9w") pod "f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" (UID: "f2af9bdd-a17a-414b-8fd0-7a4e44b64f71"). InnerVolumeSpecName "kube-api-access-4cg9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.527145 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cg9w\" (UniqueName: \"kubernetes.io/projected/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71-kube-api-access-4cg9w\") on node \"crc\" DevicePath \"\"" Mar 12 15:20:59 crc kubenswrapper[4921]: I0312 15:20:59.999676 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" path="/var/lib/kubelet/pods/f2af9bdd-a17a-414b-8fd0-7a4e44b64f71/volumes" Mar 12 15:21:00 crc kubenswrapper[4921]: I0312 15:21:00.270359 4921 scope.go:117] "RemoveContainer" containerID="8961a30658e5eac0dc090c742c1b1bdab44a27d2f924dfebb987e932f855e415" Mar 12 15:21:00 crc kubenswrapper[4921]: I0312 15:21:00.270414 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/crc-debug-8cdbz" Mar 12 15:21:05 crc kubenswrapper[4921]: I0312 15:21:05.984085 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:21:05 crc kubenswrapper[4921]: E0312 15:21:05.984698 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:21:17 crc kubenswrapper[4921]: I0312 15:21:17.989993 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:21:17 crc kubenswrapper[4921]: E0312 15:21:17.991069 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:21:31 crc kubenswrapper[4921]: I0312 15:21:31.984157 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:21:31 crc kubenswrapper[4921]: E0312 15:21:31.985291 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.062627 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bcbd96998-bx4p5_59a6f440-5a89-42a7-baa1-77a875476665/barbican-api/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.272941 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bcbd96998-bx4p5_59a6f440-5a89-42a7-baa1-77a875476665/barbican-api-log/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.356497 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b64f84d4-tpqnj_47867e82-3783-4f22-bc4f-9128016cf98e/barbican-keystone-listener/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.579545 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-594f99766c-xf6hh_6c4d7515-b40d-418c-b32e-b6a857c040a7/barbican-worker/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.590225 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b64f84d4-tpqnj_47867e82-3783-4f22-bc4f-9128016cf98e/barbican-keystone-listener-log/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.628604 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-594f99766c-xf6hh_6c4d7515-b40d-418c-b32e-b6a857c040a7/barbican-worker-log/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.872437 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf_e5130d9e-9678-42d8-9394-bcced05db054/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:34 crc kubenswrapper[4921]: I0312 15:21:34.893896 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/ceilometer-central-agent/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.057485 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/ceilometer-notification-agent/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.092586 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/sg-core/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.108313 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/proxy-httpd/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.288668 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk_cbaebc43-5127-4000-abb3-79a878177cd2/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.315298 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-dt558_f5b6000a-13f1-4d52-9a03-3b777b3d651d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.607421 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5b74f92-1f9b-4321-b549-47269e3eb04c/cinder-api-log/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.701548 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5b74f92-1f9b-4321-b549-47269e3eb04c/cinder-api/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.909907 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-1_b1c64c98-e301-4386-b33e-ccd4fde7592d/cinder-api/0.log" Mar 12 15:21:35 crc kubenswrapper[4921]: I0312 15:21:35.977479 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-1_b1c64c98-e301-4386-b33e-ccd4fde7592d/cinder-api-log/0.log" Mar 12 15:21:36 crc kubenswrapper[4921]: I0312 15:21:36.319207 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0ca55d43-e73b-403b-9760-f71e8b926650/probe/0.log" Mar 12 15:21:36 crc kubenswrapper[4921]: I0312 15:21:36.323990 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7cda98bc-d6ac-4204-8477-8ecd7dafb976/cinder-scheduler/0.log" Mar 12 15:21:36 crc kubenswrapper[4921]: I0312 15:21:36.566433 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7cda98bc-d6ac-4204-8477-8ecd7dafb976/probe/0.log" Mar 12 15:21:36 crc kubenswrapper[4921]: I0312 15:21:36.876804 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8671593e-1709-4d99-ae81-8639ee492d20/probe/0.log" Mar 12 15:21:37 crc kubenswrapper[4921]: I0312 15:21:37.064515 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw_0a18ea59-b5e6-40e3-8096-0f2bda4563bb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:37 crc kubenswrapper[4921]: I0312 15:21:37.395999 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pcpck_5a0ab9f2-e0b6-40e1-9816-a11f8135ed75/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:37 crc kubenswrapper[4921]: I0312 15:21:37.623199 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/init/0.log" Mar 12 15:21:37 crc kubenswrapper[4921]: I0312 15:21:37.837201 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/init/0.log" Mar 12 15:21:38 crc kubenswrapper[4921]: I0312 15:21:38.440069 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ddcb284-70a7-47da-8b0e-e5ba1f0a9443/glance-httpd/0.log" Mar 12 15:21:38 crc kubenswrapper[4921]: I0312 15:21:38.670165 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ddcb284-70a7-47da-8b0e-e5ba1f0a9443/glance-log/0.log" Mar 12 15:21:38 crc kubenswrapper[4921]: I0312 15:21:38.953570 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/dnsmasq-dns/0.log" Mar 12 15:21:38 crc kubenswrapper[4921]: I0312 15:21:38.957053 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-1_5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b/glance-httpd/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.075368 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-1_5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b/glance-log/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.281071 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d506b9f9-1563-432f-9b21-760ceb017fe9/glance-log/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.316576 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d506b9f9-1563-432f-9b21-760ceb017fe9/glance-httpd/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.569345 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-1_739d7b6f-9f1d-4052-958f-e08821db9361/glance-log/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.623374 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-1_739d7b6f-9f1d-4052-958f-e08821db9361/glance-httpd/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.792944 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0ca55d43-e73b-403b-9760-f71e8b926650/cinder-backup/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.955106 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8671593e-1709-4d99-ae81-8639ee492d20/cinder-volume/0.log" Mar 12 15:21:39 crc kubenswrapper[4921]: I0312 15:21:39.973479 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-bbd56cc76-cwl96_e6e62dec-8193-4d3c-a111-2ee250f79b86/horizon/0.log" Mar 12 15:21:40 crc kubenswrapper[4921]: I0312 15:21:40.046590 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl_c4eac827-ab86-4fef-b974-8638416f5125/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:40 crc kubenswrapper[4921]: I0312 15:21:40.327741 
4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hfsqf_56567424-34cd-49a4-ad03-c72a25a07058/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:40 crc kubenswrapper[4921]: I0312 15:21:40.776271 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555401-cfhz9_c85b992e-689f-4f2f-9799-da7e608f6ca8/keystone-cron/0.log" Mar 12 15:21:40 crc kubenswrapper[4921]: I0312 15:21:40.949324 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-bbd56cc76-cwl96_e6e62dec-8193-4d3c-a111-2ee250f79b86/horizon-log/0.log" Mar 12 15:21:41 crc kubenswrapper[4921]: I0312 15:21:41.029415 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555461-nscpw_ce60198f-3189-4ce6-b4a7-32387eb98fa7/keystone-cron/0.log" Mar 12 15:21:41 crc kubenswrapper[4921]: I0312 15:21:41.200398 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_01d94a77-b0dc-48b9-863b-71dbccd74bfb/kube-state-metrics/0.log" Mar 12 15:21:41 crc kubenswrapper[4921]: I0312 15:21:41.476720 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6_2ee1e205-39b3-4648-8c21-4a7cd46b867f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:42 crc kubenswrapper[4921]: I0312 15:21:42.824652 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-42nbh_f1a475b3-67ed-40db-b403-0f82930d5d36/neutron-httpd/0.log" Mar 12 15:21:43 crc kubenswrapper[4921]: I0312 15:21:43.000169 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c8b44c5c7-l6d8m_8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118/keystone-api/0.log" Mar 12 15:21:43 crc kubenswrapper[4921]: I0312 15:21:43.406988 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-c8b44c5c7-pc46f_3fcdfac3-13b0-42ac-9396-587a7d443e2a/keystone-api/0.log" Mar 12 15:21:43 crc kubenswrapper[4921]: I0312 15:21:43.975102 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2_f5126789-42a1-4b3d-bc96-384b4db790b6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:44 crc kubenswrapper[4921]: I0312 15:21:44.084522 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-9c58r_4d97370e-b2d5-463a-ba6d-5e8e12618140/neutron-httpd/0.log" Mar 12 15:21:46 crc kubenswrapper[4921]: I0312 15:21:46.984804 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:21:46 crc kubenswrapper[4921]: E0312 15:21:46.985670 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:21:47 crc kubenswrapper[4921]: I0312 15:21:47.791941 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_148f1f44-e990-4353-b376-1ccbb7f01d0a/nova-api-log/0.log" Mar 12 15:21:50 crc kubenswrapper[4921]: I0312 15:21:50.804867 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-9c58r_4d97370e-b2d5-463a-ba6d-5e8e12618140/neutron-api/0.log" Mar 12 15:21:51 crc kubenswrapper[4921]: I0312 15:21:51.040256 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_148f1f44-e990-4353-b376-1ccbb7f01d0a/nova-api-api/0.log" Mar 12 15:21:51 crc kubenswrapper[4921]: I0312 15:21:51.183195 4921 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-42nbh_f1a475b3-67ed-40db-b403-0f82930d5d36/neutron-api/0.log" Mar 12 15:21:51 crc kubenswrapper[4921]: I0312 15:21:51.981366 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a7798e1f-b22a-4ebd-a812-e8c17694cf60/nova-cell1-conductor-conductor/0.log" Mar 12 15:21:52 crc kubenswrapper[4921]: I0312 15:21:52.419536 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_072b6f7c-f4af-4657-82e6-ff8acb7404d5/nova-cell0-conductor-conductor/0.log" Mar 12 15:21:52 crc kubenswrapper[4921]: I0312 15:21:52.739935 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6f997ce1-fc3d-4a1c-b9a8-d357e879f70d/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 15:21:53 crc kubenswrapper[4921]: I0312 15:21:53.029492 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j_bcef78dc-2d5d-4a04-b106-2b54e1b11292/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:53 crc kubenswrapper[4921]: I0312 15:21:53.335850 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a8089872-446f-4355-94d8-8b82e1b04030/nova-metadata-log/0.log" Mar 12 15:21:54 crc kubenswrapper[4921]: I0312 15:21:54.808382 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-1_ae5ecb59-c6e0-4a5f-a034-059935a3eaff/nova-api-api/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.009103 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-1_ae5ecb59-c6e0-4a5f-a034-059935a3eaff/nova-api-log/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.259763 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/mysql-bootstrap/0.log" 
Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.292006 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a8089872-446f-4355-94d8-8b82e1b04030/nova-metadata-metadata/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.492179 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/mysql-bootstrap/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.522242 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/galera/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.748699 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/mysql-bootstrap/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.951326 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f0c221da-6e02-450a-a048-9c8292c208ff/memcached/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.963600 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/mysql-bootstrap/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.966770 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3862104-1cf4-4b79-ab48-f94ad1e83964/nova-scheduler-scheduler/0.log" Mar 12 15:21:55 crc kubenswrapper[4921]: I0312 15:21:55.978143 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/galera/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.224001 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_345031e5-3e52-4b4e-ba3d-73bc5c3fe95d/openstackclient/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.224840 
4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zhfgt_0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb/openstack-network-exporter/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.297127 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server-init/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.489081 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server-init/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.543002 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovs-vswitchd/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.553754 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.594986 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s4mtb_6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb/ovn-controller/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.797794 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-p2wxb_8697c3cf-f4d2-45fb-9347-c580192e39d2/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.801638 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47b82052-6f75-4fe5-b4af-9726f2a59c2f/openstack-network-exporter/0.log" Mar 12 15:21:56 crc kubenswrapper[4921]: I0312 15:21:56.860683 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47b82052-6f75-4fe5-b4af-9726f2a59c2f/ovn-northd/0.log" Mar 12 15:21:57 crc 
kubenswrapper[4921]: I0312 15:21:57.015325 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ed0ceb5e-c541-4d3f-99b9-1865684ffa9d/openstack-network-exporter/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.019905 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ed0ceb5e-c541-4d3f-99b9-1865684ffa9d/ovsdbserver-nb/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.118337 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_cae9c939-db1a-4372-b8a0-ff4e9892cb85/openstack-network-exporter/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.255870 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_228e4171-a3c9-483e-bfa6-1e0cef68384c/openstack-network-exporter/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.322712 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_cae9c939-db1a-4372-b8a0-ff4e9892cb85/ovsdbserver-nb/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.351460 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_228e4171-a3c9-483e-bfa6-1e0cef68384c/ovsdbserver-sb/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.660332 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/setup-container/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.877622 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/setup-container/0.log" Mar 12 15:21:57 crc kubenswrapper[4921]: I0312 15:21:57.964960 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/rabbitmq/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 
15:21:58.133050 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/setup-container/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.202688 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7ffb8f48-l6m2k_0091a555-ed5b-415c-ba49-7d2c64fdf54d/placement-api/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.314519 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7ffb8f48-l6m2k_0091a555-ed5b-415c-ba49-7d2c64fdf54d/placement-log/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.381337 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/setup-container/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.410262 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/rabbitmq/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.411019 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z_55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.664742 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5_66cfa5a2-1910-4504-84cb-24e75749c210/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.665146 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nzzfd_095fb2e2-a411-4c41-bf21-1c8b69166a54/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.837285 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7x2dm_7dc60d30-c59f-4cd4-b798-7e8214c0fa52/ssh-known-hosts-edpm-deployment/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.904002 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b061c47e-9c37-48ed-a879-9263d780de9f/tempest-tests-tempest-tests-runner/0.log" Mar 12 15:21:58 crc kubenswrapper[4921]: I0312 15:21:58.930323 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5d16b762-c737-4831-ae57-099f1da5d7fb/test-operator-logs-container/0.log" Mar 12 15:21:59 crc kubenswrapper[4921]: I0312 15:21:59.120498 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm_36211ec3-db4f-4485-a93d-08dd120af919/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:21:59 crc kubenswrapper[4921]: I0312 15:21:59.983788 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:21:59 crc kubenswrapper[4921]: E0312 15:21:59.984270 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.147768 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555482-2cztc"] Mar 12 15:22:00 crc kubenswrapper[4921]: E0312 15:22:00.148323 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" containerName="container-00" Mar 12 15:22:00 
crc kubenswrapper[4921]: I0312 15:22:00.148341 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" containerName="container-00" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.148599 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2af9bdd-a17a-414b-8fd0-7a4e44b64f71" containerName="container-00" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.149489 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.153517 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.154146 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.155292 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.162919 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-2cztc"] Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.259414 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qfs5\" (UniqueName: \"kubernetes.io/projected/11b52980-8d5c-425d-addb-5227add5653f-kube-api-access-6qfs5\") pod \"auto-csr-approver-29555482-2cztc\" (UID: \"11b52980-8d5c-425d-addb-5227add5653f\") " pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.360929 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qfs5\" (UniqueName: \"kubernetes.io/projected/11b52980-8d5c-425d-addb-5227add5653f-kube-api-access-6qfs5\") pod 
\"auto-csr-approver-29555482-2cztc\" (UID: \"11b52980-8d5c-425d-addb-5227add5653f\") " pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.395597 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qfs5\" (UniqueName: \"kubernetes.io/projected/11b52980-8d5c-425d-addb-5227add5653f-kube-api-access-6qfs5\") pod \"auto-csr-approver-29555482-2cztc\" (UID: \"11b52980-8d5c-425d-addb-5227add5653f\") " pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:00 crc kubenswrapper[4921]: I0312 15:22:00.474801 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:01 crc kubenswrapper[4921]: I0312 15:22:01.037142 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-2cztc"] Mar 12 15:22:01 crc kubenswrapper[4921]: I0312 15:22:01.048627 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:22:01 crc kubenswrapper[4921]: I0312 15:22:01.968254 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-2cztc" event={"ID":"11b52980-8d5c-425d-addb-5227add5653f","Type":"ContainerStarted","Data":"4569eefc159e9bbd258a3d5040609b4b6177cc3c76bda0723b208990a8b12135"} Mar 12 15:22:02 crc kubenswrapper[4921]: I0312 15:22:02.980415 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-2cztc" event={"ID":"11b52980-8d5c-425d-addb-5227add5653f","Type":"ContainerStarted","Data":"b816f229e478bb14860505718b596c4709f5f435529b1d35ba8b7e68c618bb25"} Mar 12 15:22:03 crc kubenswrapper[4921]: I0312 15:22:03.016742 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555482-2cztc" podStartSLOduration=1.94094411 podStartE2EDuration="3.016714286s" 
podCreationTimestamp="2026-03-12 15:22:00 +0000 UTC" firstStartedPulling="2026-03-12 15:22:01.048393971 +0000 UTC m=+7943.738465942" lastFinishedPulling="2026-03-12 15:22:02.124164137 +0000 UTC m=+7944.814236118" observedRunningTime="2026-03-12 15:22:02.996416896 +0000 UTC m=+7945.686488867" watchObservedRunningTime="2026-03-12 15:22:03.016714286 +0000 UTC m=+7945.706786257" Mar 12 15:22:03 crc kubenswrapper[4921]: I0312 15:22:03.994330 4921 generic.go:334] "Generic (PLEG): container finished" podID="11b52980-8d5c-425d-addb-5227add5653f" containerID="b816f229e478bb14860505718b596c4709f5f435529b1d35ba8b7e68c618bb25" exitCode=0 Mar 12 15:22:03 crc kubenswrapper[4921]: I0312 15:22:03.994410 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-2cztc" event={"ID":"11b52980-8d5c-425d-addb-5227add5653f","Type":"ContainerDied","Data":"b816f229e478bb14860505718b596c4709f5f435529b1d35ba8b7e68c618bb25"} Mar 12 15:22:05 crc kubenswrapper[4921]: I0312 15:22:05.396652 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:05 crc kubenswrapper[4921]: I0312 15:22:05.410975 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qfs5\" (UniqueName: \"kubernetes.io/projected/11b52980-8d5c-425d-addb-5227add5653f-kube-api-access-6qfs5\") pod \"11b52980-8d5c-425d-addb-5227add5653f\" (UID: \"11b52980-8d5c-425d-addb-5227add5653f\") " Mar 12 15:22:05 crc kubenswrapper[4921]: I0312 15:22:05.429916 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b52980-8d5c-425d-addb-5227add5653f-kube-api-access-6qfs5" (OuterVolumeSpecName: "kube-api-access-6qfs5") pod "11b52980-8d5c-425d-addb-5227add5653f" (UID: "11b52980-8d5c-425d-addb-5227add5653f"). InnerVolumeSpecName "kube-api-access-6qfs5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:22:05 crc kubenswrapper[4921]: I0312 15:22:05.514421 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qfs5\" (UniqueName: \"kubernetes.io/projected/11b52980-8d5c-425d-addb-5227add5653f-kube-api-access-6qfs5\") on node \"crc\" DevicePath \"\"" Mar 12 15:22:06 crc kubenswrapper[4921]: I0312 15:22:06.016905 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555482-2cztc" event={"ID":"11b52980-8d5c-425d-addb-5227add5653f","Type":"ContainerDied","Data":"4569eefc159e9bbd258a3d5040609b4b6177cc3c76bda0723b208990a8b12135"} Mar 12 15:22:06 crc kubenswrapper[4921]: I0312 15:22:06.016950 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4569eefc159e9bbd258a3d5040609b4b6177cc3c76bda0723b208990a8b12135" Mar 12 15:22:06 crc kubenswrapper[4921]: I0312 15:22:06.017009 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555482-2cztc" Mar 12 15:22:06 crc kubenswrapper[4921]: I0312 15:22:06.081262 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-wc6zf"] Mar 12 15:22:06 crc kubenswrapper[4921]: I0312 15:22:06.093450 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555476-wc6zf"] Mar 12 15:22:08 crc kubenswrapper[4921]: I0312 15:22:08.022469 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e564772e-8d64-4f52-a570-a89fb04b1560" path="/var/lib/kubelet/pods/e564772e-8d64-4f52-a570-a89fb04b1560/volumes" Mar 12 15:22:12 crc kubenswrapper[4921]: I0312 15:22:12.984290 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:22:12 crc kubenswrapper[4921]: E0312 15:22:12.985413 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.088114 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/util/0.log"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.360264 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/pull/0.log"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.414348 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/util/0.log"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.448206 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/pull/0.log"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.634203 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/util/0.log"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.681373 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/extract/0.log"
Mar 12 15:22:23 crc kubenswrapper[4921]: I0312 15:22:23.681931 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/pull/0.log"
Mar 12 15:22:24 crc kubenswrapper[4921]: I0312 15:22:24.568420 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-j46tf_5908e8b2-d088-4190-8ccf-ea7526921e80/manager/0.log"
Mar 12 15:22:24 crc kubenswrapper[4921]: I0312 15:22:24.898511 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-5jt7c_7494cb10-090c-4ac2-bbf1-663979f3e4cf/manager/0.log"
Mar 12 15:22:25 crc kubenswrapper[4921]: I0312 15:22:25.078649 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-nq8wj_c6de3785-ea06-49bb-9b39-d8f2f10bce81/manager/0.log"
Mar 12 15:22:25 crc kubenswrapper[4921]: I0312 15:22:25.343848 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fp4rs_001425f5-0a2a-4bdc-a437-d6f9ba3687b4/manager/0.log"
Mar 12 15:22:25 crc kubenswrapper[4921]: I0312 15:22:25.958776 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-67xqg_6a1a1aea-a74a-4886-ae24-1d188243e859/manager/0.log"
Mar 12 15:22:25 crc kubenswrapper[4921]: I0312 15:22:25.983311 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:22:25 crc kubenswrapper[4921]: E0312 15:22:25.983899 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:22:26 crc kubenswrapper[4921]: I0312 15:22:26.206961 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-9tkrv_c09491c8-72c5-4019-91bf-37ee1a3a937c/manager/0.log"
Mar 12 15:22:26 crc kubenswrapper[4921]: I0312 15:22:26.563224 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v42m2_d4de9b0c-3812-462a-aa80-ffe00e6d47ca/manager/0.log"
Mar 12 15:22:26 crc kubenswrapper[4921]: I0312 15:22:26.723315 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-xzm8h_fd1bc9ca-529d-4d59-a236-db1bb5c121ca/manager/0.log"
Mar 12 15:22:26 crc kubenswrapper[4921]: I0312 15:22:26.992807 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-692s5_6131e4c9-d85a-4cdf-9cec-128c9e81bc29/manager/0.log"
Mar 12 15:22:27 crc kubenswrapper[4921]: I0312 15:22:27.375750 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-kzh67_2394f3bd-4f8b-4036-b240-7ed71b80798a/manager/0.log"
Mar 12 15:22:27 crc kubenswrapper[4921]: I0312 15:22:27.660391 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-bz8j7_4e1ee178-3f0e-405a-93cb-9414b2fccbe0/manager/0.log"
Mar 12 15:22:27 crc kubenswrapper[4921]: I0312 15:22:27.748728 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-686d5f9fbd-hmkmx_1a0b0ff9-21c3-452f-9ded-00d374fbbcbe/manager/0.log"
Mar 12 15:22:27 crc kubenswrapper[4921]: I0312 15:22:27.862389 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8_0c9cd39f-8440-4f22-82ce-d3be95bea1be/manager/0.log"
Mar 12 15:22:28 crc kubenswrapper[4921]: I0312 15:22:28.121290 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-zmq56_ac8d4a43-01b6-438e-b1d8-d3521ed82176/manager/0.log"
Mar 12 15:22:28 crc kubenswrapper[4921]: I0312 15:22:28.259973 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5bc4df7446-bp8nq_c7db0c3c-40e2-49df-bffc-c0f94b26c92f/operator/0.log"
Mar 12 15:22:28 crc kubenswrapper[4921]: I0312 15:22:28.394771 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rrhpc_5f20d433-83bd-4524-a6ce-ef19ef8a1064/registry-server/0.log"
Mar 12 15:22:28 crc kubenswrapper[4921]: I0312 15:22:28.764447 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-x4tf4_994c3a47-47a7-4fbe-9f9c-df011597775b/manager/0.log"
Mar 12 15:22:28 crc kubenswrapper[4921]: I0312 15:22:28.823692 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-64dcj_3a930c0b-6c3b-4a1d-b02f-1190a124ceb2/manager/0.log"
Mar 12 15:22:28 crc kubenswrapper[4921]: I0312 15:22:28.990733 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h97zm_f0da206d-658e-47e1-9cfb-5b74237c406a/operator/0.log"
Mar 12 15:22:29 crc kubenswrapper[4921]: I0312 15:22:29.114301 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m842c_f2c81917-4047-4d0b-baed-45afa8a53a60/manager/0.log"
Mar 12 15:22:29 crc kubenswrapper[4921]: I0312 15:22:29.348535 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-dlgkj_fe35cc9d-bfc6-4a4d-b21f-06ab55672726/manager/0.log"
Mar 12 15:22:29 crc kubenswrapper[4921]: I0312 15:22:29.499738 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-2sf7v_ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b/manager/0.log"
Mar 12 15:22:29 crc kubenswrapper[4921]: I0312 15:22:29.603346 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-7l7sm_2db21a73-26d9-44d6-aa91-ba8068b0525a/manager/0.log"
Mar 12 15:22:30 crc kubenswrapper[4921]: I0312 15:22:30.271841 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5785b7957-24wxp_9b888138-4648-48a6-9364-639fb0e0c8b6/manager/0.log"
Mar 12 15:22:38 crc kubenswrapper[4921]: I0312 15:22:38.984220 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:22:38 crc kubenswrapper[4921]: E0312 15:22:38.985181 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:22:39 crc kubenswrapper[4921]: I0312 15:22:39.665403 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-dmwhv_0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea/manager/0.log"
Mar 12 15:22:50 crc kubenswrapper[4921]: I0312 15:22:50.984008 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:22:50 crc kubenswrapper[4921]: E0312 15:22:50.984906 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:22:51 crc kubenswrapper[4921]: I0312 15:22:51.443793 4921 scope.go:117] "RemoveContainer" containerID="0bc7cb9c8a0c61cda77e7fff50e03fdded791dae73ba4b596295c5210254e41f"
Mar 12 15:22:51 crc kubenswrapper[4921]: I0312 15:22:51.740114 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x8rdl_5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4/control-plane-machine-set-operator/0.log"
Mar 12 15:22:51 crc kubenswrapper[4921]: I0312 15:22:51.944544 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r7sfx_345c99f7-75d2-48da-9a45-6fd8ce5c92da/machine-api-operator/0.log"
Mar 12 15:22:51 crc kubenswrapper[4921]: I0312 15:22:51.950925 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r7sfx_345c99f7-75d2-48da-9a45-6fd8ce5c92da/kube-rbac-proxy/0.log"
Mar 12 15:23:03 crc kubenswrapper[4921]: I0312 15:23:03.984071 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:23:03 crc kubenswrapper[4921]: E0312 15:23:03.985048 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:23:04 crc kubenswrapper[4921]: I0312 15:23:04.290948 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d22fr_aabe30ef-92c9-4d25-8278-09d1dba1583b/cert-manager-controller/0.log"
Mar 12 15:23:04 crc kubenswrapper[4921]: I0312 15:23:04.536558 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-bbpqn_b02a546a-2d4e-4de2-9673-9c7b2d37a6e8/cert-manager-cainjector/0.log"
Mar 12 15:23:04 crc kubenswrapper[4921]: I0312 15:23:04.542510 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-jw9bb_5e022cd5-783e-4dbe-a554-42a43e2bc746/cert-manager-webhook/0.log"
Mar 12 15:23:16 crc kubenswrapper[4921]: I0312 15:23:16.983415 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:23:16 crc kubenswrapper[4921]: E0312 15:23:16.984530 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:23:17 crc kubenswrapper[4921]: I0312 15:23:17.539011 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-ppq69_e1bd23bf-3c09-41ff-9840-3397219f3f4d/nmstate-console-plugin/0.log"
Mar 12 15:23:17 crc kubenswrapper[4921]: I0312 15:23:17.738898 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2x8kb_7fea2e61-eacd-4cef-9425-2e03106cf6f4/nmstate-handler/0.log"
Mar 12 15:23:17 crc kubenswrapper[4921]: I0312 15:23:17.780844 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tkdph_e3a3372c-64ea-4841-91b6-55d6dbc9490a/kube-rbac-proxy/0.log"
Mar 12 15:23:17 crc kubenswrapper[4921]: I0312 15:23:17.870748 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tkdph_e3a3372c-64ea-4841-91b6-55d6dbc9490a/nmstate-metrics/0.log"
Mar 12 15:23:17 crc kubenswrapper[4921]: I0312 15:23:17.960008 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-nvv9l_7258907e-4b4e-41d5-aac1-9d0fb967e5fd/nmstate-operator/0.log"
Mar 12 15:23:18 crc kubenswrapper[4921]: I0312 15:23:18.062683 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-kf975_20f1f547-f958-419e-a5c2-58695625d6ad/nmstate-webhook/0.log"
Mar 12 15:23:31 crc kubenswrapper[4921]: I0312 15:23:31.984015 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:23:31 crc kubenswrapper[4921]: E0312 15:23:31.985092 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.853339 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hpd4s"]
Mar 12 15:23:33 crc kubenswrapper[4921]: E0312 15:23:33.854368 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b52980-8d5c-425d-addb-5227add5653f" containerName="oc"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.854400 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b52980-8d5c-425d-addb-5227add5653f" containerName="oc"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.854668 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b52980-8d5c-425d-addb-5227add5653f" containerName="oc"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.856585 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.863796 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpd4s"]
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.908854 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-catalog-content\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.908941 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-utilities\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:33 crc kubenswrapper[4921]: I0312 15:23:33.909018 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr65j\" (UniqueName: \"kubernetes.io/projected/59f59ace-8038-41f1-833d-3580f18a3a2e-kube-api-access-kr65j\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.012005 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr65j\" (UniqueName: \"kubernetes.io/projected/59f59ace-8038-41f1-833d-3580f18a3a2e-kube-api-access-kr65j\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.012256 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-catalog-content\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.012291 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-utilities\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.012900 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-utilities\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.012934 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-catalog-content\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.051868 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr65j\" (UniqueName: \"kubernetes.io/projected/59f59ace-8038-41f1-833d-3580f18a3a2e-kube-api-access-kr65j\") pod \"redhat-marketplace-hpd4s\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") " pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.181506 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.859091 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpd4s"]
Mar 12 15:23:34 crc kubenswrapper[4921]: I0312 15:23:34.892131 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerStarted","Data":"fb18fa9a5a4e07c3df80d6be59708c625fcc7114678715325370d2d7afede093"}
Mar 12 15:23:35 crc kubenswrapper[4921]: I0312 15:23:35.904489 4921 generic.go:334] "Generic (PLEG): container finished" podID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerID="7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3" exitCode=0
Mar 12 15:23:35 crc kubenswrapper[4921]: I0312 15:23:35.904628 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerDied","Data":"7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3"}
Mar 12 15:23:36 crc kubenswrapper[4921]: I0312 15:23:36.917729 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerStarted","Data":"ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b"}
Mar 12 15:23:38 crc kubenswrapper[4921]: I0312 15:23:38.938584 4921 generic.go:334] "Generic (PLEG): container finished" podID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerID="ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b" exitCode=0
Mar 12 15:23:38 crc kubenswrapper[4921]: I0312 15:23:38.938665 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerDied","Data":"ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b"}
Mar 12 15:23:39 crc kubenswrapper[4921]: I0312 15:23:39.967170 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerStarted","Data":"5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd"}
Mar 12 15:23:44 crc kubenswrapper[4921]: I0312 15:23:44.181675 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:44 crc kubenswrapper[4921]: I0312 15:23:44.182591 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:44 crc kubenswrapper[4921]: I0312 15:23:44.231768 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:44 crc kubenswrapper[4921]: I0312 15:23:44.253423 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hpd4s" podStartSLOduration=7.629705946 podStartE2EDuration="11.253393423s" podCreationTimestamp="2026-03-12 15:23:33 +0000 UTC" firstStartedPulling="2026-03-12 15:23:35.9078688 +0000 UTC m=+8038.597940781" lastFinishedPulling="2026-03-12 15:23:39.531556287 +0000 UTC m=+8042.221628258" observedRunningTime="2026-03-12 15:23:39.989526589 +0000 UTC m=+8042.679598560" watchObservedRunningTime="2026-03-12 15:23:44.253393423 +0000 UTC m=+8046.943465414"
Mar 12 15:23:44 crc kubenswrapper[4921]: I0312 15:23:44.983906 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:23:44 crc kubenswrapper[4921]: E0312 15:23:44.984277 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:23:45 crc kubenswrapper[4921]: I0312 15:23:45.079284 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:46 crc kubenswrapper[4921]: I0312 15:23:46.872066 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpd4s"]
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.042268 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hpd4s" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="registry-server" containerID="cri-o://5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd" gracePeriod=2
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.479849 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.633034 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr65j\" (UniqueName: \"kubernetes.io/projected/59f59ace-8038-41f1-833d-3580f18a3a2e-kube-api-access-kr65j\") pod \"59f59ace-8038-41f1-833d-3580f18a3a2e\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") "
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.633522 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-utilities\") pod \"59f59ace-8038-41f1-833d-3580f18a3a2e\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") "
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.633557 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-catalog-content\") pod \"59f59ace-8038-41f1-833d-3580f18a3a2e\" (UID: \"59f59ace-8038-41f1-833d-3580f18a3a2e\") "
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.634665 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-utilities" (OuterVolumeSpecName: "utilities") pod "59f59ace-8038-41f1-833d-3580f18a3a2e" (UID: "59f59ace-8038-41f1-833d-3580f18a3a2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.640706 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f59ace-8038-41f1-833d-3580f18a3a2e-kube-api-access-kr65j" (OuterVolumeSpecName: "kube-api-access-kr65j") pod "59f59ace-8038-41f1-833d-3580f18a3a2e" (UID: "59f59ace-8038-41f1-833d-3580f18a3a2e"). InnerVolumeSpecName "kube-api-access-kr65j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.665304 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59f59ace-8038-41f1-833d-3580f18a3a2e" (UID: "59f59ace-8038-41f1-833d-3580f18a3a2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.735907 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr65j\" (UniqueName: \"kubernetes.io/projected/59f59ace-8038-41f1-833d-3580f18a3a2e-kube-api-access-kr65j\") on node \"crc\" DevicePath \"\""
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.735951 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:23:47 crc kubenswrapper[4921]: I0312 15:23:47.735965 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59f59ace-8038-41f1-833d-3580f18a3a2e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.054266 4921 generic.go:334] "Generic (PLEG): container finished" podID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerID="5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd" exitCode=0
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.054322 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerDied","Data":"5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd"}
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.054357 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hpd4s" event={"ID":"59f59ace-8038-41f1-833d-3580f18a3a2e","Type":"ContainerDied","Data":"fb18fa9a5a4e07c3df80d6be59708c625fcc7114678715325370d2d7afede093"}
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.054379 4921 scope.go:117] "RemoveContainer" containerID="5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.054330 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hpd4s"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.078154 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpd4s"]
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.081390 4921 scope.go:117] "RemoveContainer" containerID="ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.088769 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hpd4s"]
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.099480 4921 scope.go:117] "RemoveContainer" containerID="7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.155362 4921 scope.go:117] "RemoveContainer" containerID="5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd"
Mar 12 15:23:48 crc kubenswrapper[4921]: E0312 15:23:48.155903 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd\": container with ID starting with 5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd not found: ID does not exist" containerID="5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.155971 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd"} err="failed to get container status \"5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd\": rpc error: code = NotFound desc = could not find container \"5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd\": container with ID starting with 5a174d6971535303754d0b0cdfccaef3bf808d87a57cb2bd3fe5f868c8d60acd not found: ID does not exist"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.156014 4921 scope.go:117] "RemoveContainer" containerID="ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b"
Mar 12 15:23:48 crc kubenswrapper[4921]: E0312 15:23:48.156398 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b\": container with ID starting with ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b not found: ID does not exist" containerID="ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.156434 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b"} err="failed to get container status \"ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b\": rpc error: code = NotFound desc = could not find container \"ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b\": container with ID starting with ff380b38a218ba07920ecc3c7c76843d9b1cbe21e89a189a74b9a6a33f0ddc7b not found: ID does not exist"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.156456 4921 scope.go:117] "RemoveContainer" containerID="7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3"
Mar 12 15:23:48 crc kubenswrapper[4921]: E0312 15:23:48.156724 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3\": container with ID starting with 7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3 not found: ID does not exist" containerID="7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.156754 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3"} err="failed to get container status \"7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3\": rpc error: code = NotFound desc = could not find container \"7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3\": container with ID starting with 7a0ac989a54039b3da39b66b2c0acf083da63cd6c8e965df52d9494ba05075a3 not found: ID does not exist"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.895141 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-nzvhg_ceb498e3-36d0-4f72-9c07-54807b7a11ea/kube-rbac-proxy/0.log"
Mar 12 15:23:48 crc kubenswrapper[4921]: I0312 15:23:48.954589 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-nzvhg_ceb498e3-36d0-4f72-9c07-54807b7a11ea/controller/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.124195 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.324682 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.345179 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.364838 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.381458 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.527098 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.547340 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.551279 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.589136 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.759496 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.774276 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/controller/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.783564 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.812318 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:23:49 crc kubenswrapper[4921]: I0312 15:23:49.994420 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" path="/var/lib/kubelet/pods/59f59ace-8038-41f1-833d-3580f18a3a2e/volumes"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.002226 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/frr-metrics/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.019746 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/kube-rbac-proxy/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.038248 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/kube-rbac-proxy-frr/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.179102 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/reloader/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.286041 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jn8d5_aabfc338-f7a1-46a8-a02a-daf1adc64862/frr-k8s-webhook-server/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.535803 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74b4d54bf-8p27k_ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234/manager/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.657853 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78c99c5f4b-pq84h_7a20ce4c-4e95-4fcd-ba22-212cc219c81f/webhook-server/0.log"
Mar 12 15:23:50 crc kubenswrapper[4921]: I0312 15:23:50.855500 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfh6j_8ae92198-0eeb-414f-859a-27c54e4338bf/kube-rbac-proxy/0.log"
Mar 12 15:23:51 crc kubenswrapper[4921]: I0312 15:23:51.459507 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfh6j_8ae92198-0eeb-414f-859a-27c54e4338bf/speaker/0.log"
Mar 12 15:23:52 crc kubenswrapper[4921]: I0312 15:23:52.606470 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/frr/0.log"
Mar 12 15:23:59 crc kubenswrapper[4921]: I0312 15:23:59.983233 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f"
Mar 12 15:23:59 crc kubenswrapper[4921]: E0312 15:23:59.984450 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.147807 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555484-rz5r2"]
Mar 12 15:24:00 crc kubenswrapper[4921]: E0312 15:24:00.148611 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="extract-content"
Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.148649 4921 state_mem.go:107] "Deleted CPUSet assignment"
podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="extract-content" Mar 12 15:24:00 crc kubenswrapper[4921]: E0312 15:24:00.148672 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="registry-server" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.148715 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="registry-server" Mar 12 15:24:00 crc kubenswrapper[4921]: E0312 15:24:00.148766 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="extract-utilities" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.148779 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="extract-utilities" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.149102 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f59ace-8038-41f1-833d-3580f18a3a2e" containerName="registry-server" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.151321 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.167480 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-rz5r2"] Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.191857 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.191948 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.191879 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.309924 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bffqn\" (UniqueName: \"kubernetes.io/projected/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476-kube-api-access-bffqn\") pod \"auto-csr-approver-29555484-rz5r2\" (UID: \"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476\") " pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.412649 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bffqn\" (UniqueName: \"kubernetes.io/projected/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476-kube-api-access-bffqn\") pod \"auto-csr-approver-29555484-rz5r2\" (UID: \"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476\") " pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.432903 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bffqn\" (UniqueName: \"kubernetes.io/projected/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476-kube-api-access-bffqn\") pod \"auto-csr-approver-29555484-rz5r2\" (UID: \"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476\") " 
pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.507387 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:00 crc kubenswrapper[4921]: I0312 15:24:00.962621 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-rz5r2"] Mar 12 15:24:01 crc kubenswrapper[4921]: I0312 15:24:01.172737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" event={"ID":"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476","Type":"ContainerStarted","Data":"dec7b708a450e7562cb6b16edd88041a22d768dd2000ea9f32e23df80f14b82c"} Mar 12 15:24:03 crc kubenswrapper[4921]: I0312 15:24:03.194561 4921 generic.go:334] "Generic (PLEG): container finished" podID="2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476" containerID="4011dd91f23660c3f69dfd2777f071e28923f396187572e0175bd19be349edb1" exitCode=0 Mar 12 15:24:03 crc kubenswrapper[4921]: I0312 15:24:03.194931 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" event={"ID":"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476","Type":"ContainerDied","Data":"4011dd91f23660c3f69dfd2777f071e28923f396187572e0175bd19be349edb1"} Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.321676 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/util/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.525886 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/pull/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.536125 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/pull/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.579013 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.582163 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/util/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.714562 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bffqn\" (UniqueName: \"kubernetes.io/projected/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476-kube-api-access-bffqn\") pod \"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476\" (UID: \"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476\") " Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.733617 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476-kube-api-access-bffqn" (OuterVolumeSpecName: "kube-api-access-bffqn") pod "2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476" (UID: "2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476"). InnerVolumeSpecName "kube-api-access-bffqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.798234 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/util/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.805280 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/pull/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.817021 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bffqn\" (UniqueName: \"kubernetes.io/projected/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476-kube-api-access-bffqn\") on node \"crc\" DevicePath \"\"" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.833977 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/extract/0.log" Mar 12 15:24:04 crc kubenswrapper[4921]: I0312 15:24:04.966313 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/util/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.169439 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/util/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.172069 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/pull/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.198300 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/pull/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.243751 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" event={"ID":"2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476","Type":"ContainerDied","Data":"dec7b708a450e7562cb6b16edd88041a22d768dd2000ea9f32e23df80f14b82c"} Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.243800 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dec7b708a450e7562cb6b16edd88041a22d768dd2000ea9f32e23df80f14b82c" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.243941 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555484-rz5r2" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.349446 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/extract/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.423955 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/pull/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.426691 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/util/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.578773 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-utilities/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 
15:24:05.657050 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-nbrck"] Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.668449 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555478-nbrck"] Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.769420 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-content/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.807421 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-content/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.808875 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-utilities/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.957630 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-content/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.969597 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-utilities/0.log" Mar 12 15:24:05 crc kubenswrapper[4921]: I0312 15:24:05.995342 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef400fc-80d5-42bd-9a95-b63a725790e4" path="/var/lib/kubelet/pods/5ef400fc-80d5-42bd-9a95-b63a725790e4/volumes" Mar 12 15:24:06 crc kubenswrapper[4921]: I0312 15:24:06.236139 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-utilities/0.log" Mar 12 15:24:06 crc 
kubenswrapper[4921]: I0312 15:24:06.550531 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-content/0.log" Mar 12 15:24:06 crc kubenswrapper[4921]: I0312 15:24:06.555276 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-utilities/0.log" Mar 12 15:24:06 crc kubenswrapper[4921]: I0312 15:24:06.557737 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-content/0.log" Mar 12 15:24:06 crc kubenswrapper[4921]: I0312 15:24:06.757232 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-content/0.log" Mar 12 15:24:06 crc kubenswrapper[4921]: I0312 15:24:06.841222 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-utilities/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.089294 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/registry-server/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.132900 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cc774_f8eea941-027c-44f8-a189-b7e9b3c6cb55/marketplace-operator/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.270874 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-utilities/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.427839 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/registry-server/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.491291 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-utilities/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.529020 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-content/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.614637 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-content/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.763145 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-content/0.log" Mar 12 15:24:07 crc kubenswrapper[4921]: I0312 15:24:07.765582 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-utilities/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.035294 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/registry-server/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.221189 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-utilities/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.415799 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-content/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.420943 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-content/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.425758 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-utilities/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.614351 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-utilities/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.646385 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-content/0.log" Mar 12 15:24:08 crc kubenswrapper[4921]: I0312 15:24:08.964657 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/registry-server/0.log" Mar 12 15:24:14 crc kubenswrapper[4921]: I0312 15:24:14.983972 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:24:14 crc kubenswrapper[4921]: E0312 15:24:14.984765 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:24:29 crc 
kubenswrapper[4921]: I0312 15:24:29.271044 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:24:29 crc kubenswrapper[4921]: E0312 15:24:29.272239 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:24:42 crc kubenswrapper[4921]: I0312 15:24:42.983400 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:24:42 crc kubenswrapper[4921]: E0312 15:24:42.984563 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:24:51 crc kubenswrapper[4921]: I0312 15:24:51.571255 4921 scope.go:117] "RemoveContainer" containerID="bd382fc7d191d23425fab8be0a14c6b79e6fada1e4101839597734931d0c0a6b" Mar 12 15:24:53 crc kubenswrapper[4921]: I0312 15:24:53.984028 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:24:53 crc kubenswrapper[4921]: E0312 15:24:53.985175 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:25:08 crc kubenswrapper[4921]: I0312 15:25:08.984309 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:25:08 crc kubenswrapper[4921]: E0312 15:25:08.985649 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:25:19 crc kubenswrapper[4921]: I0312 15:25:19.989460 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:25:19 crc kubenswrapper[4921]: E0312 15:25:19.990665 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:25:31 crc kubenswrapper[4921]: I0312 15:25:31.983655 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:25:32 crc kubenswrapper[4921]: I0312 15:25:32.927135 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"2c8ef7671ee7546b2decc42a107ed27fc61073a784d7daa28725d0413b1e5ae7"} Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.152492 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555486-7twgj"] Mar 12 15:26:00 crc kubenswrapper[4921]: E0312 15:26:00.154071 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476" containerName="oc" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.154098 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476" containerName="oc" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.154512 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476" containerName="oc" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.156032 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.158123 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.158456 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.158729 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.168808 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-7twgj"] Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.263742 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjr5c\" (UniqueName: 
\"kubernetes.io/projected/8ae661f9-d836-4ace-8e58-8a4b362c683a-kube-api-access-mjr5c\") pod \"auto-csr-approver-29555486-7twgj\" (UID: \"8ae661f9-d836-4ace-8e58-8a4b362c683a\") " pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.365935 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjr5c\" (UniqueName: \"kubernetes.io/projected/8ae661f9-d836-4ace-8e58-8a4b362c683a-kube-api-access-mjr5c\") pod \"auto-csr-approver-29555486-7twgj\" (UID: \"8ae661f9-d836-4ace-8e58-8a4b362c683a\") " pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.385728 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjr5c\" (UniqueName: \"kubernetes.io/projected/8ae661f9-d836-4ace-8e58-8a4b362c683a-kube-api-access-mjr5c\") pod \"auto-csr-approver-29555486-7twgj\" (UID: \"8ae661f9-d836-4ace-8e58-8a4b362c683a\") " pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.487136 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:00 crc kubenswrapper[4921]: I0312 15:26:00.969654 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-7twgj"] Mar 12 15:26:01 crc kubenswrapper[4921]: I0312 15:26:01.193788 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-7twgj" event={"ID":"8ae661f9-d836-4ace-8e58-8a4b362c683a","Type":"ContainerStarted","Data":"856b794e6c8ed1877073370c2b923fba283c46691d882a672dc00ab2650ee8e2"} Mar 12 15:26:22 crc kubenswrapper[4921]: E0312 15:26:22.998213 4921 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = initializing source docker://registry.redhat.io/openshift4/ose-cli:latest: reading manifest latest in registry.redhat.io/openshift4/ose-cli: received unexpected HTTP status: 500 Internal Server Error" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 12 15:26:23 crc kubenswrapper[4921]: E0312 15:26:23.001799 4921 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 15:26:23 crc kubenswrapper[4921]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 12 15:26:23 crc kubenswrapper[4921]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjr5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29555486-7twgj_openshift-infra(8ae661f9-d836-4ace-8e58-8a4b362c683a): ErrImagePull: initializing source docker://registry.redhat.io/openshift4/ose-cli:latest: reading manifest latest in registry.redhat.io/openshift4/ose-cli: received unexpected HTTP status: 500 Internal Server Error Mar 12 15:26:23 crc kubenswrapper[4921]: > logger="UnhandledError" Mar 12 15:26:23 crc kubenswrapper[4921]: E0312 15:26:23.003050 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"initializing source docker://registry.redhat.io/openshift4/ose-cli:latest: reading manifest latest in registry.redhat.io/openshift4/ose-cli: received unexpected HTTP status: 500 Internal Server Error\"" pod="openshift-infra/auto-csr-approver-29555486-7twgj" podUID="8ae661f9-d836-4ace-8e58-8a4b362c683a" Mar 12 15:26:23 crc kubenswrapper[4921]: E0312 15:26:23.457201 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29555486-7twgj" podUID="8ae661f9-d836-4ace-8e58-8a4b362c683a" Mar 12 15:26:40 crc kubenswrapper[4921]: 
I0312 15:26:40.687868 4921 generic.go:334] "Generic (PLEG): container finished" podID="8ae661f9-d836-4ace-8e58-8a4b362c683a" containerID="9c278395ac90ec85b24be344af27b73c70ec3445115c61ecbf5a670dfe309326" exitCode=0 Mar 12 15:26:40 crc kubenswrapper[4921]: I0312 15:26:40.687937 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-7twgj" event={"ID":"8ae661f9-d836-4ace-8e58-8a4b362c683a","Type":"ContainerDied","Data":"9c278395ac90ec85b24be344af27b73c70ec3445115c61ecbf5a670dfe309326"} Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.063176 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.158183 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjr5c\" (UniqueName: \"kubernetes.io/projected/8ae661f9-d836-4ace-8e58-8a4b362c683a-kube-api-access-mjr5c\") pod \"8ae661f9-d836-4ace-8e58-8a4b362c683a\" (UID: \"8ae661f9-d836-4ace-8e58-8a4b362c683a\") " Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.168109 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae661f9-d836-4ace-8e58-8a4b362c683a-kube-api-access-mjr5c" (OuterVolumeSpecName: "kube-api-access-mjr5c") pod "8ae661f9-d836-4ace-8e58-8a4b362c683a" (UID: "8ae661f9-d836-4ace-8e58-8a4b362c683a"). InnerVolumeSpecName "kube-api-access-mjr5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.262609 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjr5c\" (UniqueName: \"kubernetes.io/projected/8ae661f9-d836-4ace-8e58-8a4b362c683a-kube-api-access-mjr5c\") on node \"crc\" DevicePath \"\"" Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.710958 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555486-7twgj" event={"ID":"8ae661f9-d836-4ace-8e58-8a4b362c683a","Type":"ContainerDied","Data":"856b794e6c8ed1877073370c2b923fba283c46691d882a672dc00ab2650ee8e2"} Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.711012 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856b794e6c8ed1877073370c2b923fba283c46691d882a672dc00ab2650ee8e2" Mar 12 15:26:42 crc kubenswrapper[4921]: I0312 15:26:42.711016 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555486-7twgj" Mar 12 15:26:43 crc kubenswrapper[4921]: I0312 15:26:43.145203 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-jhdsm"] Mar 12 15:26:43 crc kubenswrapper[4921]: I0312 15:26:43.158018 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555480-jhdsm"] Mar 12 15:26:43 crc kubenswrapper[4921]: I0312 15:26:43.997961 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28baf337-488f-431b-8f48-49fe07810f9e" path="/var/lib/kubelet/pods/28baf337-488f-431b-8f48-49fe07810f9e/volumes" Mar 12 15:26:51 crc kubenswrapper[4921]: I0312 15:26:51.672167 4921 scope.go:117] "RemoveContainer" containerID="11a2ce2cf26685312e91198842d10ea17050f901589e9477b94761370703cc4a" Mar 12 15:26:51 crc kubenswrapper[4921]: I0312 15:26:51.726634 4921 scope.go:117] "RemoveContainer" 
containerID="6734177d1276d460b3666c08a4038268df44a6c4aea5144bf67e7b2557966335" Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.755908 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kztfc"] Mar 12 15:26:55 crc kubenswrapper[4921]: E0312 15:26:55.757371 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae661f9-d836-4ace-8e58-8a4b362c683a" containerName="oc" Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.757388 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae661f9-d836-4ace-8e58-8a4b362c683a" containerName="oc" Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.757604 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae661f9-d836-4ace-8e58-8a4b362c683a" containerName="oc" Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.759310 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.764226 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kztfc"] Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.930492 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxkq5\" (UniqueName: \"kubernetes.io/projected/17b70379-e148-4f61-ad19-24bb89c16339-kube-api-access-vxkq5\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.931067 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-catalog-content\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" 
Mar 12 15:26:55 crc kubenswrapper[4921]: I0312 15:26:55.931107 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-utilities\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.034917 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-catalog-content\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.034997 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-utilities\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.035125 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxkq5\" (UniqueName: \"kubernetes.io/projected/17b70379-e148-4f61-ad19-24bb89c16339-kube-api-access-vxkq5\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.036151 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-utilities\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 
15:26:56.036308 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-catalog-content\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.056109 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxkq5\" (UniqueName: \"kubernetes.io/projected/17b70379-e148-4f61-ad19-24bb89c16339-kube-api-access-vxkq5\") pod \"redhat-operators-kztfc\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.093692 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.637724 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kztfc"] Mar 12 15:26:56 crc kubenswrapper[4921]: I0312 15:26:56.851326 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerStarted","Data":"9566fa00ed84eaecf73e2769f177806cb708852c6bbd92c34830b616d5158eb6"} Mar 12 15:26:57 crc kubenswrapper[4921]: I0312 15:26:57.861071 4921 generic.go:334] "Generic (PLEG): container finished" podID="17b70379-e148-4f61-ad19-24bb89c16339" containerID="b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8" exitCode=0 Mar 12 15:26:57 crc kubenswrapper[4921]: I0312 15:26:57.861138 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerDied","Data":"b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8"} Mar 12 
15:27:01 crc kubenswrapper[4921]: I0312 15:27:01.931752 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerStarted","Data":"e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8"} Mar 12 15:27:06 crc kubenswrapper[4921]: I0312 15:27:06.984270 4921 generic.go:334] "Generic (PLEG): container finished" podID="17b70379-e148-4f61-ad19-24bb89c16339" containerID="e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8" exitCode=0 Mar 12 15:27:06 crc kubenswrapper[4921]: I0312 15:27:06.984485 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerDied","Data":"e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8"} Mar 12 15:27:06 crc kubenswrapper[4921]: I0312 15:27:06.988856 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:27:08 crc kubenswrapper[4921]: I0312 15:27:08.000734 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerStarted","Data":"ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac"} Mar 12 15:27:08 crc kubenswrapper[4921]: I0312 15:27:08.027382 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kztfc" podStartSLOduration=3.23638126 podStartE2EDuration="13.027354248s" podCreationTimestamp="2026-03-12 15:26:55 +0000 UTC" firstStartedPulling="2026-03-12 15:26:57.864321154 +0000 UTC m=+8240.554393125" lastFinishedPulling="2026-03-12 15:27:07.655294142 +0000 UTC m=+8250.345366113" observedRunningTime="2026-03-12 15:27:08.021568118 +0000 UTC m=+8250.711640089" watchObservedRunningTime="2026-03-12 15:27:08.027354248 +0000 UTC 
m=+8250.717426229" Mar 12 15:27:16 crc kubenswrapper[4921]: I0312 15:27:16.094560 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:27:16 crc kubenswrapper[4921]: I0312 15:27:16.095194 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:27:16 crc kubenswrapper[4921]: I0312 15:27:16.147762 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:27:17 crc kubenswrapper[4921]: I0312 15:27:17.128092 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:27:17 crc kubenswrapper[4921]: I0312 15:27:17.178765 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kztfc"] Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.092977 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kztfc" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="registry-server" containerID="cri-o://ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac" gracePeriod=2 Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.603687 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.668071 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxkq5\" (UniqueName: \"kubernetes.io/projected/17b70379-e148-4f61-ad19-24bb89c16339-kube-api-access-vxkq5\") pod \"17b70379-e148-4f61-ad19-24bb89c16339\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.668140 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-utilities\") pod \"17b70379-e148-4f61-ad19-24bb89c16339\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.668253 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-catalog-content\") pod \"17b70379-e148-4f61-ad19-24bb89c16339\" (UID: \"17b70379-e148-4f61-ad19-24bb89c16339\") " Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.669334 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-utilities" (OuterVolumeSpecName: "utilities") pod "17b70379-e148-4f61-ad19-24bb89c16339" (UID: "17b70379-e148-4f61-ad19-24bb89c16339"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.690777 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b70379-e148-4f61-ad19-24bb89c16339-kube-api-access-vxkq5" (OuterVolumeSpecName: "kube-api-access-vxkq5") pod "17b70379-e148-4f61-ad19-24bb89c16339" (UID: "17b70379-e148-4f61-ad19-24bb89c16339"). InnerVolumeSpecName "kube-api-access-vxkq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.771269 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxkq5\" (UniqueName: \"kubernetes.io/projected/17b70379-e148-4f61-ad19-24bb89c16339-kube-api-access-vxkq5\") on node \"crc\" DevicePath \"\"" Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.771329 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.796957 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17b70379-e148-4f61-ad19-24bb89c16339" (UID: "17b70379-e148-4f61-ad19-24bb89c16339"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:27:19 crc kubenswrapper[4921]: I0312 15:27:19.873833 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17b70379-e148-4f61-ad19-24bb89c16339-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.107480 4921 generic.go:334] "Generic (PLEG): container finished" podID="17b70379-e148-4f61-ad19-24bb89c16339" containerID="ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac" exitCode=0 Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.107547 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerDied","Data":"ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac"} Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.107520 4921 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kztfc" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.107600 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kztfc" event={"ID":"17b70379-e148-4f61-ad19-24bb89c16339","Type":"ContainerDied","Data":"9566fa00ed84eaecf73e2769f177806cb708852c6bbd92c34830b616d5158eb6"} Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.107624 4921 scope.go:117] "RemoveContainer" containerID="ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.157666 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kztfc"] Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.163897 4921 scope.go:117] "RemoveContainer" containerID="e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.171333 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kztfc"] Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.192730 4921 scope.go:117] "RemoveContainer" containerID="b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.234178 4921 scope.go:117] "RemoveContainer" containerID="ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac" Mar 12 15:27:20 crc kubenswrapper[4921]: E0312 15:27:20.234669 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac\": container with ID starting with ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac not found: ID does not exist" containerID="ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.234724 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac"} err="failed to get container status \"ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac\": rpc error: code = NotFound desc = could not find container \"ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac\": container with ID starting with ed0ebc7168195f682e7dab1229e098c482770b531ab3ed27e4a4e7897d744dac not found: ID does not exist" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.234769 4921 scope.go:117] "RemoveContainer" containerID="e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8" Mar 12 15:27:20 crc kubenswrapper[4921]: E0312 15:27:20.235641 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8\": container with ID starting with e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8 not found: ID does not exist" containerID="e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.235680 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8"} err="failed to get container status \"e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8\": rpc error: code = NotFound desc = could not find container \"e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8\": container with ID starting with e346d040ec97bfe03574ddce448b69c4666bad256b2eea2c4bb77787bba7baf8 not found: ID does not exist" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.235706 4921 scope.go:117] "RemoveContainer" containerID="b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8" Mar 12 15:27:20 crc kubenswrapper[4921]: E0312 
15:27:20.236171 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8\": container with ID starting with b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8 not found: ID does not exist" containerID="b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8" Mar 12 15:27:20 crc kubenswrapper[4921]: I0312 15:27:20.236192 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8"} err="failed to get container status \"b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8\": rpc error: code = NotFound desc = could not find container \"b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8\": container with ID starting with b1b4232285f7d9d8efcf53ca758679fba281d9109adeebf257a521cb1b97f4f8 not found: ID does not exist" Mar 12 15:27:21 crc kubenswrapper[4921]: I0312 15:27:21.997573 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b70379-e148-4f61-ad19-24bb89c16339" path="/var/lib/kubelet/pods/17b70379-e148-4f61-ad19-24bb89c16339/volumes" Mar 12 15:27:37 crc kubenswrapper[4921]: I0312 15:27:37.305399 4921 generic.go:334] "Generic (PLEG): container finished" podID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerID="36f8982c976b2d67bd2a6d42d369b39129ee567469a0f23d436f05d556bc937a" exitCode=0 Mar 12 15:27:37 crc kubenswrapper[4921]: I0312 15:27:37.305440 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tcf58/must-gather-nhsgw" event={"ID":"67c4e944-fd05-4288-934f-5ecbe702e2b6","Type":"ContainerDied","Data":"36f8982c976b2d67bd2a6d42d369b39129ee567469a0f23d436f05d556bc937a"} Mar 12 15:27:37 crc kubenswrapper[4921]: I0312 15:27:37.306953 4921 scope.go:117] "RemoveContainer" 
containerID="36f8982c976b2d67bd2a6d42d369b39129ee567469a0f23d436f05d556bc937a" Mar 12 15:27:37 crc kubenswrapper[4921]: I0312 15:27:37.558144 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcf58_must-gather-nhsgw_67c4e944-fd05-4288-934f-5ecbe702e2b6/gather/0.log" Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.204841 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-tcf58/must-gather-nhsgw"] Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.205878 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-tcf58/must-gather-nhsgw" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="copy" containerID="cri-o://d422d7efbeeafdc3adc7ce07afb9585227326b52c138b103905cc028668a2e26" gracePeriod=2 Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.218303 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-tcf58/must-gather-nhsgw"] Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.412037 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcf58_must-gather-nhsgw_67c4e944-fd05-4288-934f-5ecbe702e2b6/copy/0.log" Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.413100 4921 generic.go:334] "Generic (PLEG): container finished" podID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerID="d422d7efbeeafdc3adc7ce07afb9585227326b52c138b103905cc028668a2e26" exitCode=143 Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.683141 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcf58_must-gather-nhsgw_67c4e944-fd05-4288-934f-5ecbe702e2b6/copy/0.log" Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.683737 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.781964 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhb6v\" (UniqueName: \"kubernetes.io/projected/67c4e944-fd05-4288-934f-5ecbe702e2b6-kube-api-access-bhb6v\") pod \"67c4e944-fd05-4288-934f-5ecbe702e2b6\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.782241 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67c4e944-fd05-4288-934f-5ecbe702e2b6-must-gather-output\") pod \"67c4e944-fd05-4288-934f-5ecbe702e2b6\" (UID: \"67c4e944-fd05-4288-934f-5ecbe702e2b6\") " Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.793138 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c4e944-fd05-4288-934f-5ecbe702e2b6-kube-api-access-bhb6v" (OuterVolumeSpecName: "kube-api-access-bhb6v") pod "67c4e944-fd05-4288-934f-5ecbe702e2b6" (UID: "67c4e944-fd05-4288-934f-5ecbe702e2b6"). InnerVolumeSpecName "kube-api-access-bhb6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:27:47 crc kubenswrapper[4921]: I0312 15:27:47.885149 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhb6v\" (UniqueName: \"kubernetes.io/projected/67c4e944-fd05-4288-934f-5ecbe702e2b6-kube-api-access-bhb6v\") on node \"crc\" DevicePath \"\"" Mar 12 15:27:48 crc kubenswrapper[4921]: I0312 15:27:48.081933 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67c4e944-fd05-4288-934f-5ecbe702e2b6-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "67c4e944-fd05-4288-934f-5ecbe702e2b6" (UID: "67c4e944-fd05-4288-934f-5ecbe702e2b6"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:27:48 crc kubenswrapper[4921]: I0312 15:27:48.099553 4921 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/67c4e944-fd05-4288-934f-5ecbe702e2b6-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 12 15:27:48 crc kubenswrapper[4921]: I0312 15:27:48.423661 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-tcf58_must-gather-nhsgw_67c4e944-fd05-4288-934f-5ecbe702e2b6/copy/0.log" Mar 12 15:27:48 crc kubenswrapper[4921]: I0312 15:27:48.424129 4921 scope.go:117] "RemoveContainer" containerID="d422d7efbeeafdc3adc7ce07afb9585227326b52c138b103905cc028668a2e26" Mar 12 15:27:48 crc kubenswrapper[4921]: I0312 15:27:48.424228 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tcf58/must-gather-nhsgw" Mar 12 15:27:48 crc kubenswrapper[4921]: I0312 15:27:48.446941 4921 scope.go:117] "RemoveContainer" containerID="36f8982c976b2d67bd2a6d42d369b39129ee567469a0f23d436f05d556bc937a" Mar 12 15:27:49 crc kubenswrapper[4921]: I0312 15:27:49.994684 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" path="/var/lib/kubelet/pods/67c4e944-fd05-4288-934f-5ecbe702e2b6/volumes" Mar 12 15:27:51 crc kubenswrapper[4921]: I0312 15:27:51.856756 4921 scope.go:117] "RemoveContainer" containerID="9f4cbf9225bc64836012350afca6bdae4e59fa1afad82e3cb8e61abf9ee83c48" Mar 12 15:27:56 crc kubenswrapper[4921]: I0312 15:27:56.324062 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:27:56 crc kubenswrapper[4921]: I0312 15:27:56.324851 4921 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.161608 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555488-sdk2q"] Mar 12 15:28:00 crc kubenswrapper[4921]: E0312 15:28:00.162990 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="gather" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163009 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="gather" Mar 12 15:28:00 crc kubenswrapper[4921]: E0312 15:28:00.163038 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="extract-utilities" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163047 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="extract-utilities" Mar 12 15:28:00 crc kubenswrapper[4921]: E0312 15:28:00.163069 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="registry-server" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163077 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="registry-server" Mar 12 15:28:00 crc kubenswrapper[4921]: E0312 15:28:00.163093 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="extract-content" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163102 4921 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="extract-content" Mar 12 15:28:00 crc kubenswrapper[4921]: E0312 15:28:00.163120 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="copy" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163128 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="copy" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163593 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="copy" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163611 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c4e944-fd05-4288-934f-5ecbe702e2b6" containerName="gather" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.163628 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b70379-e148-4f61-ad19-24bb89c16339" containerName="registry-server" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.165010 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.170409 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.170539 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.172927 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-sdk2q"] Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.176961 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.198255 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq42h\" (UniqueName: \"kubernetes.io/projected/594712a0-438a-4239-86bf-77785f152327-kube-api-access-jq42h\") pod \"auto-csr-approver-29555488-sdk2q\" (UID: \"594712a0-438a-4239-86bf-77785f152327\") " pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.301412 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq42h\" (UniqueName: \"kubernetes.io/projected/594712a0-438a-4239-86bf-77785f152327-kube-api-access-jq42h\") pod \"auto-csr-approver-29555488-sdk2q\" (UID: \"594712a0-438a-4239-86bf-77785f152327\") " pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.335702 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq42h\" (UniqueName: \"kubernetes.io/projected/594712a0-438a-4239-86bf-77785f152327-kube-api-access-jq42h\") pod \"auto-csr-approver-29555488-sdk2q\" (UID: \"594712a0-438a-4239-86bf-77785f152327\") " 
pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:00 crc kubenswrapper[4921]: I0312 15:28:00.498653 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:01 crc kubenswrapper[4921]: I0312 15:28:01.048588 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-sdk2q"] Mar 12 15:28:01 crc kubenswrapper[4921]: I0312 15:28:01.566941 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" event={"ID":"594712a0-438a-4239-86bf-77785f152327","Type":"ContainerStarted","Data":"23b326320c31e074390243d0fff72ce15c875ee236d9735a9b73afa054a6c604"} Mar 12 15:28:03 crc kubenswrapper[4921]: I0312 15:28:03.594342 4921 generic.go:334] "Generic (PLEG): container finished" podID="594712a0-438a-4239-86bf-77785f152327" containerID="45be2bb72a956c93c3583edf55de76d90b2dfde4092e328553566151abbd1b4e" exitCode=0 Mar 12 15:28:03 crc kubenswrapper[4921]: I0312 15:28:03.594437 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" event={"ID":"594712a0-438a-4239-86bf-77785f152327","Type":"ContainerDied","Data":"45be2bb72a956c93c3583edf55de76d90b2dfde4092e328553566151abbd1b4e"} Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.058641 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.159477 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq42h\" (UniqueName: \"kubernetes.io/projected/594712a0-438a-4239-86bf-77785f152327-kube-api-access-jq42h\") pod \"594712a0-438a-4239-86bf-77785f152327\" (UID: \"594712a0-438a-4239-86bf-77785f152327\") " Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.169139 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594712a0-438a-4239-86bf-77785f152327-kube-api-access-jq42h" (OuterVolumeSpecName: "kube-api-access-jq42h") pod "594712a0-438a-4239-86bf-77785f152327" (UID: "594712a0-438a-4239-86bf-77785f152327"). InnerVolumeSpecName "kube-api-access-jq42h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.262156 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq42h\" (UniqueName: \"kubernetes.io/projected/594712a0-438a-4239-86bf-77785f152327-kube-api-access-jq42h\") on node \"crc\" DevicePath \"\"" Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.618180 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" event={"ID":"594712a0-438a-4239-86bf-77785f152327","Type":"ContainerDied","Data":"23b326320c31e074390243d0fff72ce15c875ee236d9735a9b73afa054a6c604"} Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.618282 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23b326320c31e074390243d0fff72ce15c875ee236d9735a9b73afa054a6c604" Mar 12 15:28:05 crc kubenswrapper[4921]: I0312 15:28:05.618375 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555488-sdk2q" Mar 12 15:28:06 crc kubenswrapper[4921]: I0312 15:28:06.135929 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-2cztc"] Mar 12 15:28:06 crc kubenswrapper[4921]: I0312 15:28:06.153659 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555482-2cztc"] Mar 12 15:28:07 crc kubenswrapper[4921]: I0312 15:28:07.996102 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b52980-8d5c-425d-addb-5227add5653f" path="/var/lib/kubelet/pods/11b52980-8d5c-425d-addb-5227add5653f/volumes" Mar 12 15:28:26 crc kubenswrapper[4921]: I0312 15:28:26.324038 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:28:26 crc kubenswrapper[4921]: I0312 15:28:26.325096 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:28:52 crc kubenswrapper[4921]: I0312 15:28:52.003776 4921 scope.go:117] "RemoveContainer" containerID="b816f229e478bb14860505718b596c4709f5f435529b1d35ba8b7e68c618bb25" Mar 12 15:28:56 crc kubenswrapper[4921]: I0312 15:28:56.323841 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:28:56 crc kubenswrapper[4921]: 
I0312 15:28:56.324655 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:28:56 crc kubenswrapper[4921]: I0312 15:28:56.324725 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:28:56 crc kubenswrapper[4921]: I0312 15:28:56.325844 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c8ef7671ee7546b2decc42a107ed27fc61073a784d7daa28725d0413b1e5ae7"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:28:56 crc kubenswrapper[4921]: I0312 15:28:56.325915 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://2c8ef7671ee7546b2decc42a107ed27fc61073a784d7daa28725d0413b1e5ae7" gracePeriod=600 Mar 12 15:28:57 crc kubenswrapper[4921]: I0312 15:28:57.208841 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="2c8ef7671ee7546b2decc42a107ed27fc61073a784d7daa28725d0413b1e5ae7" exitCode=0 Mar 12 15:28:57 crc kubenswrapper[4921]: I0312 15:28:57.208880 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"2c8ef7671ee7546b2decc42a107ed27fc61073a784d7daa28725d0413b1e5ae7"} Mar 12 15:28:57 crc 
kubenswrapper[4921]: I0312 15:28:57.209659 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54"} Mar 12 15:28:57 crc kubenswrapper[4921]: I0312 15:28:57.209683 4921 scope.go:117] "RemoveContainer" containerID="6d25df45263e51dff96ccfa4324d98d2b765a3de474c730622647eddb738ee4f" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.144557 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555490-dhtm5"] Mar 12 15:30:00 crc kubenswrapper[4921]: E0312 15:30:00.147301 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594712a0-438a-4239-86bf-77785f152327" containerName="oc" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.147351 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="594712a0-438a-4239-86bf-77785f152327" containerName="oc" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.148493 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="594712a0-438a-4239-86bf-77785f152327" containerName="oc" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.150122 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.158423 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.158730 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.158972 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.175518 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd"] Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.181972 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.184730 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.185069 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.190806 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-dhtm5"] Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.208686 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd"] Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.246762 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw252\" (UniqueName: 
\"kubernetes.io/projected/1a13769b-bac4-4484-8f58-fefc1c5532ee-kube-api-access-hw252\") pod \"auto-csr-approver-29555490-dhtm5\" (UID: \"1a13769b-bac4-4484-8f58-fefc1c5532ee\") " pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.350616 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c9e025-b795-434d-9578-b2a8c3d32ab5-config-volume\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.350991 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98c9e025-b795-434d-9578-b2a8c3d32ab5-secret-volume\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.351260 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw252\" (UniqueName: \"kubernetes.io/projected/1a13769b-bac4-4484-8f58-fefc1c5532ee-kube-api-access-hw252\") pod \"auto-csr-approver-29555490-dhtm5\" (UID: \"1a13769b-bac4-4484-8f58-fefc1c5532ee\") " pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.351392 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjj5b\" (UniqueName: \"kubernetes.io/projected/98c9e025-b795-434d-9578-b2a8c3d32ab5-kube-api-access-vjj5b\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 
crc kubenswrapper[4921]: I0312 15:30:00.378585 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw252\" (UniqueName: \"kubernetes.io/projected/1a13769b-bac4-4484-8f58-fefc1c5532ee-kube-api-access-hw252\") pod \"auto-csr-approver-29555490-dhtm5\" (UID: \"1a13769b-bac4-4484-8f58-fefc1c5532ee\") " pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.454046 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c9e025-b795-434d-9578-b2a8c3d32ab5-config-volume\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.454204 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98c9e025-b795-434d-9578-b2a8c3d32ab5-secret-volume\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.454318 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjj5b\" (UniqueName: \"kubernetes.io/projected/98c9e025-b795-434d-9578-b2a8c3d32ab5-kube-api-access-vjj5b\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.455061 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c9e025-b795-434d-9578-b2a8c3d32ab5-config-volume\") pod \"collect-profiles-29555490-2vjtd\" (UID: 
\"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.459417 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98c9e025-b795-434d-9578-b2a8c3d32ab5-secret-volume\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.489167 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.506550 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjj5b\" (UniqueName: \"kubernetes.io/projected/98c9e025-b795-434d-9578-b2a8c3d32ab5-kube-api-access-vjj5b\") pod \"collect-profiles-29555490-2vjtd\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.799887 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:00 crc kubenswrapper[4921]: I0312 15:30:00.960861 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-dhtm5"] Mar 12 15:30:01 crc kubenswrapper[4921]: I0312 15:30:01.267919 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd"] Mar 12 15:30:01 crc kubenswrapper[4921]: W0312 15:30:01.268803 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c9e025_b795_434d_9578_b2a8c3d32ab5.slice/crio-bd3254a022ef339548702e1ad92a7cbcd6b21cb346fc50c6ce89635f93a3ed38 WatchSource:0}: Error finding container bd3254a022ef339548702e1ad92a7cbcd6b21cb346fc50c6ce89635f93a3ed38: Status 404 returned error can't find the container with id bd3254a022ef339548702e1ad92a7cbcd6b21cb346fc50c6ce89635f93a3ed38 Mar 12 15:30:01 crc kubenswrapper[4921]: I0312 15:30:01.892482 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" event={"ID":"1a13769b-bac4-4484-8f58-fefc1c5532ee","Type":"ContainerStarted","Data":"990ce43a6d7cc6272290e8898fc509af0a8a80ceb57c966bc14cf97c3ea72df2"} Mar 12 15:30:01 crc kubenswrapper[4921]: I0312 15:30:01.894070 4921 generic.go:334] "Generic (PLEG): container finished" podID="98c9e025-b795-434d-9578-b2a8c3d32ab5" containerID="82f0b978609d038ca88535dd875294ed27f2cdbadbe48df6403dc820a3892d66" exitCode=0 Mar 12 15:30:01 crc kubenswrapper[4921]: I0312 15:30:01.894123 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" event={"ID":"98c9e025-b795-434d-9578-b2a8c3d32ab5","Type":"ContainerDied","Data":"82f0b978609d038ca88535dd875294ed27f2cdbadbe48df6403dc820a3892d66"} Mar 12 15:30:01 crc kubenswrapper[4921]: I0312 15:30:01.894152 4921 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" event={"ID":"98c9e025-b795-434d-9578-b2a8c3d32ab5","Type":"ContainerStarted","Data":"bd3254a022ef339548702e1ad92a7cbcd6b21cb346fc50c6ce89635f93a3ed38"} Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.281100 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.422699 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c9e025-b795-434d-9578-b2a8c3d32ab5-config-volume\") pod \"98c9e025-b795-434d-9578-b2a8c3d32ab5\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.423180 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjj5b\" (UniqueName: \"kubernetes.io/projected/98c9e025-b795-434d-9578-b2a8c3d32ab5-kube-api-access-vjj5b\") pod \"98c9e025-b795-434d-9578-b2a8c3d32ab5\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.423351 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98c9e025-b795-434d-9578-b2a8c3d32ab5-secret-volume\") pod \"98c9e025-b795-434d-9578-b2a8c3d32ab5\" (UID: \"98c9e025-b795-434d-9578-b2a8c3d32ab5\") " Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.425736 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c9e025-b795-434d-9578-b2a8c3d32ab5-config-volume" (OuterVolumeSpecName: "config-volume") pod "98c9e025-b795-434d-9578-b2a8c3d32ab5" (UID: "98c9e025-b795-434d-9578-b2a8c3d32ab5"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.431334 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c9e025-b795-434d-9578-b2a8c3d32ab5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "98c9e025-b795-434d-9578-b2a8c3d32ab5" (UID: "98c9e025-b795-434d-9578-b2a8c3d32ab5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.432059 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c9e025-b795-434d-9578-b2a8c3d32ab5-kube-api-access-vjj5b" (OuterVolumeSpecName: "kube-api-access-vjj5b") pod "98c9e025-b795-434d-9578-b2a8c3d32ab5" (UID: "98c9e025-b795-434d-9578-b2a8c3d32ab5"). InnerVolumeSpecName "kube-api-access-vjj5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.526039 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjj5b\" (UniqueName: \"kubernetes.io/projected/98c9e025-b795-434d-9578-b2a8c3d32ab5-kube-api-access-vjj5b\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.526084 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98c9e025-b795-434d-9578-b2a8c3d32ab5-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.526103 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c9e025-b795-434d-9578-b2a8c3d32ab5-config-volume\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.916224 4921 generic.go:334] "Generic (PLEG): container finished" podID="1a13769b-bac4-4484-8f58-fefc1c5532ee" 
containerID="38da3af9d2069f7cbd520d4b35ed77ce32b373721aed72037f0520e55f5fa6bc" exitCode=0 Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.916344 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" event={"ID":"1a13769b-bac4-4484-8f58-fefc1c5532ee","Type":"ContainerDied","Data":"38da3af9d2069f7cbd520d4b35ed77ce32b373721aed72037f0520e55f5fa6bc"} Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.920443 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" event={"ID":"98c9e025-b795-434d-9578-b2a8c3d32ab5","Type":"ContainerDied","Data":"bd3254a022ef339548702e1ad92a7cbcd6b21cb346fc50c6ce89635f93a3ed38"} Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.920503 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd3254a022ef339548702e1ad92a7cbcd6b21cb346fc50c6ce89635f93a3ed38" Mar 12 15:30:03 crc kubenswrapper[4921]: I0312 15:30:03.920599 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555490-2vjtd" Mar 12 15:30:04 crc kubenswrapper[4921]: I0312 15:30:04.355836 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr"] Mar 12 15:30:04 crc kubenswrapper[4921]: I0312 15:30:04.365363 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555445-xtjfr"] Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.270754 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.369724 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw252\" (UniqueName: \"kubernetes.io/projected/1a13769b-bac4-4484-8f58-fefc1c5532ee-kube-api-access-hw252\") pod \"1a13769b-bac4-4484-8f58-fefc1c5532ee\" (UID: \"1a13769b-bac4-4484-8f58-fefc1c5532ee\") " Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.378035 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a13769b-bac4-4484-8f58-fefc1c5532ee-kube-api-access-hw252" (OuterVolumeSpecName: "kube-api-access-hw252") pod "1a13769b-bac4-4484-8f58-fefc1c5532ee" (UID: "1a13769b-bac4-4484-8f58-fefc1c5532ee"). InnerVolumeSpecName "kube-api-access-hw252". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.473432 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw252\" (UniqueName: \"kubernetes.io/projected/1a13769b-bac4-4484-8f58-fefc1c5532ee-kube-api-access-hw252\") on node \"crc\" DevicePath \"\"" Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.937895 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" event={"ID":"1a13769b-bac4-4484-8f58-fefc1c5532ee","Type":"ContainerDied","Data":"990ce43a6d7cc6272290e8898fc509af0a8a80ceb57c966bc14cf97c3ea72df2"} Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.937947 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="990ce43a6d7cc6272290e8898fc509af0a8a80ceb57c966bc14cf97c3ea72df2" Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.938015 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555490-dhtm5" Mar 12 15:30:05 crc kubenswrapper[4921]: I0312 15:30:05.997070 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80794446-210c-4c0c-ae85-fdbd7565ba54" path="/var/lib/kubelet/pods/80794446-210c-4c0c-ae85-fdbd7565ba54/volumes" Mar 12 15:30:06 crc kubenswrapper[4921]: I0312 15:30:06.328743 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-rz5r2"] Mar 12 15:30:06 crc kubenswrapper[4921]: I0312 15:30:06.344231 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555484-rz5r2"] Mar 12 15:30:07 crc kubenswrapper[4921]: I0312 15:30:07.996714 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476" path="/var/lib/kubelet/pods/2d43d1d5-a4a4-45e5-bf6a-c51ffd2d3476/volumes" Mar 12 15:30:52 crc kubenswrapper[4921]: I0312 15:30:52.092586 4921 scope.go:117] "RemoveContainer" containerID="4011dd91f23660c3f69dfd2777f071e28923f396187572e0175bd19be349edb1" Mar 12 15:30:52 crc kubenswrapper[4921]: I0312 15:30:52.142910 4921 scope.go:117] "RemoveContainer" containerID="6adc46e47163256d4b6935553ca5971c6204a6f4c4458de1de5fd1470c4d8efe" Mar 12 15:30:56 crc kubenswrapper[4921]: I0312 15:30:56.324552 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:30:56 crc kubenswrapper[4921]: I0312 15:30:56.325423 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.799865 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-57wwd"] Mar 12 15:31:01 crc kubenswrapper[4921]: E0312 15:31:01.802066 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a13769b-bac4-4484-8f58-fefc1c5532ee" containerName="oc" Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.802109 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a13769b-bac4-4484-8f58-fefc1c5532ee" containerName="oc" Mar 12 15:31:01 crc kubenswrapper[4921]: E0312 15:31:01.802192 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c9e025-b795-434d-9578-b2a8c3d32ab5" containerName="collect-profiles" Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.802211 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c9e025-b795-434d-9578-b2a8c3d32ab5" containerName="collect-profiles" Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.802940 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a13769b-bac4-4484-8f58-fefc1c5532ee" containerName="oc" Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.803011 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c9e025-b795-434d-9578-b2a8c3d32ab5" containerName="collect-profiles" Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.809274 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.809762 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57wwd"]
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.811072 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgg5c\" (UniqueName: \"kubernetes.io/projected/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-kube-api-access-mgg5c\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.811199 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-utilities\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.811229 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-catalog-content\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.913559 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgg5c\" (UniqueName: \"kubernetes.io/projected/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-kube-api-access-mgg5c\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.913960 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-utilities\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.914047 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-catalog-content\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.914653 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-utilities\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.914777 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-catalog-content\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:01 crc kubenswrapper[4921]: I0312 15:31:01.937432 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgg5c\" (UniqueName: \"kubernetes.io/projected/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-kube-api-access-mgg5c\") pod \"community-operators-57wwd\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") " pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:02 crc kubenswrapper[4921]: I0312 15:31:02.051642 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bcbd96998-bx4p5_59a6f440-5a89-42a7-baa1-77a875476665/barbican-api-log/0.log"
Mar 12 15:31:02 crc kubenswrapper[4921]: I0312 15:31:02.139068 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:02 crc kubenswrapper[4921]: I0312 15:31:02.767695 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-57wwd"]
Mar 12 15:31:02 crc kubenswrapper[4921]: W0312 15:31:02.768776 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0f4a33_9f4c_4252_acd6_0e65e7e89894.slice/crio-ef0070e450e250712a0d8d70f2b91445a2dc181b3c8e5154edd7d08786957650 WatchSource:0}: Error finding container ef0070e450e250712a0d8d70f2b91445a2dc181b3c8e5154edd7d08786957650: Status 404 returned error can't find the container with id ef0070e450e250712a0d8d70f2b91445a2dc181b3c8e5154edd7d08786957650
Mar 12 15:31:02 crc kubenswrapper[4921]: I0312 15:31:02.935189 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b64f84d4-tpqnj_47867e82-3783-4f22-bc4f-9128016cf98e/barbican-keystone-listener-log/0.log"
Mar 12 15:31:03 crc kubenswrapper[4921]: I0312 15:31:03.356331 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-594f99766c-xf6hh_6c4d7515-b40d-418c-b32e-b6a857c040a7/barbican-worker-log/0.log"
Mar 12 15:31:03 crc kubenswrapper[4921]: I0312 15:31:03.527454 4921 generic.go:334] "Generic (PLEG): container finished" podID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerID="5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4" exitCode=0
Mar 12 15:31:03 crc kubenswrapper[4921]: I0312 15:31:03.527506 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerDied","Data":"5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4"}
Mar 12 15:31:03 crc kubenswrapper[4921]: I0312 15:31:03.527925 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerStarted","Data":"ef0070e450e250712a0d8d70f2b91445a2dc181b3c8e5154edd7d08786957650"}
Mar 12 15:31:03 crc kubenswrapper[4921]: I0312 15:31:03.800807 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf_e5130d9e-9678-42d8-9394-bcced05db054/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:04 crc kubenswrapper[4921]: I0312 15:31:04.327960 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/ceilometer-central-agent/0.log"
Mar 12 15:31:04 crc kubenswrapper[4921]: I0312 15:31:04.760338 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-dt558_f5b6000a-13f1-4d52-9a03-3b777b3d651d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:05 crc kubenswrapper[4921]: I0312 15:31:05.213540 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk_cbaebc43-5127-4000-abb3-79a878177cd2/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:05 crc kubenswrapper[4921]: I0312 15:31:05.562104 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerStarted","Data":"13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4"}
Mar 12 15:31:05 crc kubenswrapper[4921]: I0312 15:31:05.696932 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5b74f92-1f9b-4321-b549-47269e3eb04c/cinder-api-log/0.log"
Mar 12 15:31:06 crc kubenswrapper[4921]: I0312 15:31:06.282463 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-1_b1c64c98-e301-4386-b33e-ccd4fde7592d/cinder-api-log/0.log"
Mar 12 15:31:06 crc kubenswrapper[4921]: I0312 15:31:06.572540 4921 generic.go:334] "Generic (PLEG): container finished" podID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerID="13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4" exitCode=0
Mar 12 15:31:06 crc kubenswrapper[4921]: I0312 15:31:06.572588 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerDied","Data":"13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4"}
Mar 12 15:31:07 crc kubenswrapper[4921]: I0312 15:31:07.584596 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerStarted","Data":"ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392"}
Mar 12 15:31:07 crc kubenswrapper[4921]: I0312 15:31:07.603602 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-57wwd" podStartSLOduration=3.073663634 podStartE2EDuration="6.603576265s" podCreationTimestamp="2026-03-12 15:31:01 +0000 UTC" firstStartedPulling="2026-03-12 15:31:03.529592159 +0000 UTC m=+8486.219664130" lastFinishedPulling="2026-03-12 15:31:07.05950479 +0000 UTC m=+8489.749576761" observedRunningTime="2026-03-12 15:31:07.603464581 +0000 UTC m=+8490.293536572" watchObservedRunningTime="2026-03-12 15:31:07.603576265 +0000 UTC m=+8490.293648236"
Mar 12 15:31:09 crc kubenswrapper[4921]: I0312 15:31:09.163038 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0ca55d43-e73b-403b-9760-f71e8b926650/cinder-backup/0.log"
Mar 12 15:31:09 crc kubenswrapper[4921]: I0312 15:31:09.652743 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7cda98bc-d6ac-4204-8477-8ecd7dafb976/cinder-scheduler/0.log"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.140106 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.141051 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.188746 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8671593e-1709-4d99-ae81-8639ee492d20/cinder-volume/0.log"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.190094 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.598443 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw_0a18ea59-b5e6-40e3-8096-0f2bda4563bb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.686274 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:12 crc kubenswrapper[4921]: I0312 15:31:12.740739 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57wwd"]
Mar 12 15:31:13 crc kubenswrapper[4921]: I0312 15:31:13.039782 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pcpck_5a0ab9f2-e0b6-40e1-9816-a11f8135ed75/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:14 crc kubenswrapper[4921]: I0312 15:31:14.049499 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/dnsmasq-dns/0.log"
Mar 12 15:31:14 crc kubenswrapper[4921]: I0312 15:31:14.462342 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ddcb284-70a7-47da-8b0e-e5ba1f0a9443/glance-log/0.log"
Mar 12 15:31:14 crc kubenswrapper[4921]: I0312 15:31:14.657427 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-57wwd" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="registry-server" containerID="cri-o://ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392" gracePeriod=2
Mar 12 15:31:14 crc kubenswrapper[4921]: I0312 15:31:14.847676 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-1_5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b/glance-log/0.log"
Mar 12 15:31:14 crc kubenswrapper[4921]: E0312 15:31:14.864678 4921 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0f4a33_9f4c_4252_acd6_0e65e7e89894.slice/crio-ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0f4a33_9f4c_4252_acd6_0e65e7e89894.slice/crio-conmon-ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392.scope\": RecentStats: unable to find data in memory cache]"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.115774 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.141939 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgg5c\" (UniqueName: \"kubernetes.io/projected/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-kube-api-access-mgg5c\") pod \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") "
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.142078 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-utilities\") pod \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") "
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.142247 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-catalog-content\") pod \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\" (UID: \"6e0f4a33-9f4c-4252-acd6-0e65e7e89894\") "
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.145124 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-utilities" (OuterVolumeSpecName: "utilities") pod "6e0f4a33-9f4c-4252-acd6-0e65e7e89894" (UID: "6e0f4a33-9f4c-4252-acd6-0e65e7e89894"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.150047 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-kube-api-access-mgg5c" (OuterVolumeSpecName: "kube-api-access-mgg5c") pod "6e0f4a33-9f4c-4252-acd6-0e65e7e89894" (UID: "6e0f4a33-9f4c-4252-acd6-0e65e7e89894"). InnerVolumeSpecName "kube-api-access-mgg5c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.245531 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgg5c\" (UniqueName: \"kubernetes.io/projected/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-kube-api-access-mgg5c\") on node \"crc\" DevicePath \"\""
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.245584 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.258662 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d506b9f9-1563-432f-9b21-760ceb017fe9/glance-log/0.log"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.291867 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e0f4a33-9f4c-4252-acd6-0e65e7e89894" (UID: "6e0f4a33-9f4c-4252-acd6-0e65e7e89894"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.348139 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e0f4a33-9f4c-4252-acd6-0e65e7e89894-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.660342 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-1_739d7b6f-9f1d-4052-958f-e08821db9361/glance-log/0.log"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.669579 4921 generic.go:334] "Generic (PLEG): container finished" podID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerID="ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392" exitCode=0
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.669634 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-57wwd"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.669632 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerDied","Data":"ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392"}
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.669783 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-57wwd" event={"ID":"6e0f4a33-9f4c-4252-acd6-0e65e7e89894","Type":"ContainerDied","Data":"ef0070e450e250712a0d8d70f2b91445a2dc181b3c8e5154edd7d08786957650"}
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.669841 4921 scope.go:117] "RemoveContainer" containerID="ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.703408 4921 scope.go:117] "RemoveContainer" containerID="13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.709236 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-57wwd"]
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.724761 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-57wwd"]
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.730868 4921 scope.go:117] "RemoveContainer" containerID="5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.791666 4921 scope.go:117] "RemoveContainer" containerID="ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392"
Mar 12 15:31:15 crc kubenswrapper[4921]: E0312 15:31:15.792138 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392\": container with ID starting with ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392 not found: ID does not exist" containerID="ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.792242 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392"} err="failed to get container status \"ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392\": rpc error: code = NotFound desc = could not find container \"ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392\": container with ID starting with ac839c23ea2450a2629002d13d2b33b20146c0693cf38336b047d76da3cb2392 not found: ID does not exist"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.792337 4921 scope.go:117] "RemoveContainer" containerID="13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4"
Mar 12 15:31:15 crc kubenswrapper[4921]: E0312 15:31:15.792663 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4\": container with ID starting with 13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4 not found: ID does not exist" containerID="13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.792744 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4"} err="failed to get container status \"13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4\": rpc error: code = NotFound desc = could not find container \"13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4\": container with ID starting with 13d5f0a7c025dd10642da7e0af41af70a38ed55aefc9e5e06a1819792f728ce4 not found: ID does not exist"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.792806 4921 scope.go:117] "RemoveContainer" containerID="5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4"
Mar 12 15:31:15 crc kubenswrapper[4921]: E0312 15:31:15.793129 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4\": container with ID starting with 5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4 not found: ID does not exist" containerID="5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.793210 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4"} err="failed to get container status \"5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4\": rpc error: code = NotFound desc = could not find container \"5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4\": container with ID starting with 5764757cd562f79801d5804fcde8554f975f46f1cdf5efc73a8a70f8a4711cf4 not found: ID does not exist"
Mar 12 15:31:15 crc kubenswrapper[4921]: I0312 15:31:15.995353 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" path="/var/lib/kubelet/pods/6e0f4a33-9f4c-4252-acd6-0e65e7e89894/volumes"
Mar 12 15:31:16 crc kubenswrapper[4921]: I0312 15:31:16.977256 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-bbd56cc76-cwl96_e6e62dec-8193-4d3c-a111-2ee250f79b86/horizon-log/0.log"
Mar 12 15:31:17 crc kubenswrapper[4921]: I0312 15:31:17.415923 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl_c4eac827-ab86-4fef-b974-8638416f5125/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:17 crc kubenswrapper[4921]: I0312 15:31:17.873277 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hfsqf_56567424-34cd-49a4-ad03-c72a25a07058/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:19 crc kubenswrapper[4921]: I0312 15:31:19.524359 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c8b44c5c7-l6d8m_8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118/keystone-api/0.log"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.585502 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qpg92"]
Mar 12 15:31:20 crc kubenswrapper[4921]: E0312 15:31:20.586574 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="extract-utilities"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.586602 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="extract-utilities"
Mar 12 15:31:20 crc kubenswrapper[4921]: E0312 15:31:20.586626 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="extract-content"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.586636 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="extract-content"
Mar 12 15:31:20 crc kubenswrapper[4921]: E0312 15:31:20.586662 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="registry-server"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.586670 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="registry-server"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.586931 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e0f4a33-9f4c-4252-acd6-0e65e7e89894" containerName="registry-server"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.588934 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.615638 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpg92"]
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.663331 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jv87\" (UniqueName: \"kubernetes.io/projected/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-kube-api-access-9jv87\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.663373 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-utilities\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.663400 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-catalog-content\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.764664 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jv87\" (UniqueName: \"kubernetes.io/projected/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-kube-api-access-9jv87\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.764709 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-utilities\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.764731 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-catalog-content\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.765407 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-catalog-content\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.765809 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-utilities\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.802426 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jv87\" (UniqueName: \"kubernetes.io/projected/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-kube-api-access-9jv87\") pod \"certified-operators-qpg92\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") " pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:20 crc kubenswrapper[4921]: I0312 15:31:20.923355 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:21 crc kubenswrapper[4921]: I0312 15:31:21.423553 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c8b44c5c7-pc46f_3fcdfac3-13b0-42ac-9396-587a7d443e2a/keystone-api/0.log"
Mar 12 15:31:21 crc kubenswrapper[4921]: I0312 15:31:21.515244 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpg92"]
Mar 12 15:31:21 crc kubenswrapper[4921]: I0312 15:31:21.745319 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerStarted","Data":"58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982"}
Mar 12 15:31:21 crc kubenswrapper[4921]: I0312 15:31:21.745724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerStarted","Data":"a85259acad5ac0bbf5a22c265a6d851dbf02b323be574caf544c217451442b26"}
Mar 12 15:31:21 crc kubenswrapper[4921]: I0312 15:31:21.808347 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555401-cfhz9_c85b992e-689f-4f2f-9799-da7e608f6ca8/keystone-cron/0.log"
Mar 12 15:31:22 crc kubenswrapper[4921]: I0312 15:31:22.192211 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555461-nscpw_ce60198f-3189-4ce6-b4a7-32387eb98fa7/keystone-cron/0.log"
Mar 12 15:31:22 crc kubenswrapper[4921]: I0312 15:31:22.582855 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_01d94a77-b0dc-48b9-863b-71dbccd74bfb/kube-state-metrics/0.log"
Mar 12 15:31:22 crc kubenswrapper[4921]: I0312 15:31:22.757833 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerID="58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982" exitCode=0
Mar 12 15:31:22 crc kubenswrapper[4921]: I0312 15:31:22.757885 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerDied","Data":"58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982"}
Mar 12 15:31:23 crc kubenswrapper[4921]: I0312 15:31:23.093165 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6_2ee1e205-39b3-4648-8c21-4a7cd46b867f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:31:23 crc kubenswrapper[4921]: I0312 15:31:23.769158 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerStarted","Data":"85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b"}
Mar 12 15:31:25 crc kubenswrapper[4921]: I0312 15:31:25.794675 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerID="85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b" exitCode=0
Mar 12 15:31:25 crc kubenswrapper[4921]: I0312 15:31:25.794737 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerDied","Data":"85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b"}
Mar 12 15:31:26 crc kubenswrapper[4921]: I0312 15:31:26.323425 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:31:26 crc kubenswrapper[4921]: I0312 15:31:26.323758 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:31:26 crc kubenswrapper[4921]: I0312 15:31:26.815400 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerStarted","Data":"1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec"}
Mar 12 15:31:26 crc kubenswrapper[4921]: I0312 15:31:26.840769 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qpg92" podStartSLOduration=3.418622874 podStartE2EDuration="6.840750202s" podCreationTimestamp="2026-03-12 15:31:20 +0000 UTC" firstStartedPulling="2026-03-12 15:31:22.760888383 +0000 UTC m=+8505.450960354" lastFinishedPulling="2026-03-12 15:31:26.183015711 +0000 UTC m=+8508.873087682" observedRunningTime="2026-03-12 15:31:26.833934921 +0000 UTC m=+8509.524006892" watchObservedRunningTime="2026-03-12 15:31:26.840750202 +0000 UTC m=+8509.530822173"
Mar 12 15:31:30 crc kubenswrapper[4921]: I0312 15:31:30.923671 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:30 crc kubenswrapper[4921]: I0312 15:31:30.924629 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:30 crc kubenswrapper[4921]: I0312 15:31:30.976102 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:31 crc kubenswrapper[4921]: I0312 15:31:31.918899 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:32 crc kubenswrapper[4921]: I0312 15:31:32.222655 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpg92"]
Mar 12 15:31:32 crc kubenswrapper[4921]: I0312 15:31:32.524982 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f0c221da-6e02-450a-a048-9c8292c208ff/memcached/0.log"
Mar 12 15:31:33 crc kubenswrapper[4921]: I0312 15:31:33.883073 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qpg92" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="registry-server" containerID="cri-o://1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec" gracePeriod=2
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.396134 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpg92"
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.510075 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-utilities\") pod \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") "
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.510152 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-catalog-content\") pod \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") "
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.510583 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jv87\" (UniqueName: \"kubernetes.io/projected/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-kube-api-access-9jv87\") pod \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\" (UID: \"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af\") "
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.511451 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-utilities" (OuterVolumeSpecName: "utilities") pod "1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" (UID: "1ef5af1f-fcf7-4911-b10f-dc7bae54c2af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.511920 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-utilities\") on node \"crc\" DevicePath \"\""
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.521306 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-kube-api-access-9jv87" (OuterVolumeSpecName: "kube-api-access-9jv87") pod "1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" (UID: "1ef5af1f-fcf7-4911-b10f-dc7bae54c2af"). InnerVolumeSpecName "kube-api-access-9jv87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.579526 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" (UID: "1ef5af1f-fcf7-4911-b10f-dc7bae54c2af"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.614224 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.614280 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jv87\" (UniqueName: \"kubernetes.io/projected/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af-kube-api-access-9jv87\") on node \"crc\" DevicePath \"\"" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.910305 4921 generic.go:334] "Generic (PLEG): container finished" podID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerID="1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec" exitCode=0 Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.910384 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerDied","Data":"1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec"} Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.910426 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpg92" event={"ID":"1ef5af1f-fcf7-4911-b10f-dc7bae54c2af","Type":"ContainerDied","Data":"a85259acad5ac0bbf5a22c265a6d851dbf02b323be574caf544c217451442b26"} Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.910449 4921 scope.go:117] "RemoveContainer" containerID="1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.910630 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpg92" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.949349 4921 scope.go:117] "RemoveContainer" containerID="85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.957474 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpg92"] Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.972780 4921 scope.go:117] "RemoveContainer" containerID="58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982" Mar 12 15:31:34 crc kubenswrapper[4921]: I0312 15:31:34.974345 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qpg92"] Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.016054 4921 scope.go:117] "RemoveContainer" containerID="1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec" Mar 12 15:31:35 crc kubenswrapper[4921]: E0312 15:31:35.016585 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec\": container with ID starting with 1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec not found: ID does not exist" containerID="1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec" Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.016635 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec"} err="failed to get container status \"1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec\": rpc error: code = NotFound desc = could not find container \"1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec\": container with ID starting with 1e09d5df60ddeba3611052a84770ddf4a7e1b6b04d11ada7879e67ca4be2e4ec not 
found: ID does not exist" Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.016667 4921 scope.go:117] "RemoveContainer" containerID="85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b" Mar 12 15:31:35 crc kubenswrapper[4921]: E0312 15:31:35.017064 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b\": container with ID starting with 85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b not found: ID does not exist" containerID="85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b" Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.017100 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b"} err="failed to get container status \"85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b\": rpc error: code = NotFound desc = could not find container \"85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b\": container with ID starting with 85162adaab5a114a350bd7d81fcbd63d930f724814eba5a7d6ced14f0592310b not found: ID does not exist" Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.017150 4921 scope.go:117] "RemoveContainer" containerID="58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982" Mar 12 15:31:35 crc kubenswrapper[4921]: E0312 15:31:35.017512 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982\": container with ID starting with 58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982 not found: ID does not exist" containerID="58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982" Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.017583 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982"} err="failed to get container status \"58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982\": rpc error: code = NotFound desc = could not find container \"58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982\": container with ID starting with 58efa81c104593ff4e94050a9682038d5880658d40a3e7df5137d8e5fc4c3982 not found: ID does not exist" Mar 12 15:31:35 crc kubenswrapper[4921]: I0312 15:31:35.996885 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" path="/var/lib/kubelet/pods/1ef5af1f-fcf7-4911-b10f-dc7bae54c2af/volumes" Mar 12 15:31:37 crc kubenswrapper[4921]: I0312 15:31:37.646836 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-42nbh_f1a475b3-67ed-40db-b403-0f82930d5d36/neutron-api/0.log" Mar 12 15:31:42 crc kubenswrapper[4921]: I0312 15:31:42.335888 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-9c58r_4d97370e-b2d5-463a-ba6d-5e8e12618140/neutron-api/0.log" Mar 12 15:31:42 crc kubenswrapper[4921]: I0312 15:31:42.820552 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2_f5126789-42a1-4b3d-bc96-384b4db790b6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:31:44 crc kubenswrapper[4921]: I0312 15:31:44.999066 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_148f1f44-e990-4353-b376-1ccbb7f01d0a/nova-api-log/0.log" Mar 12 15:31:47 crc kubenswrapper[4921]: I0312 15:31:47.839334 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-1_ae5ecb59-c6e0-4a5f-a034-059935a3eaff/nova-api-log/0.log" Mar 12 15:31:48 crc kubenswrapper[4921]: I0312 15:31:48.779190 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_072b6f7c-f4af-4657-82e6-ff8acb7404d5/nova-cell0-conductor-conductor/0.log" Mar 12 15:31:49 crc kubenswrapper[4921]: I0312 15:31:49.459507 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a7798e1f-b22a-4ebd-a812-e8c17694cf60/nova-cell1-conductor-conductor/0.log" Mar 12 15:31:50 crc kubenswrapper[4921]: I0312 15:31:50.001676 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6f997ce1-fc3d-4a1c-b9a8-d357e879f70d/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 15:31:50 crc kubenswrapper[4921]: I0312 15:31:50.456578 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j_bcef78dc-2d5d-4a04-b106-2b54e1b11292/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:31:50 crc kubenswrapper[4921]: I0312 15:31:50.941287 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a8089872-446f-4355-94d8-8b82e1b04030/nova-metadata-log/0.log" Mar 12 15:31:52 crc kubenswrapper[4921]: I0312 15:31:52.480459 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3862104-1cf4-4b79-ab48-f94ad1e83964/nova-scheduler-scheduler/0.log" Mar 12 15:31:52 crc kubenswrapper[4921]: I0312 15:31:52.970459 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/galera/0.log" Mar 12 15:31:53 crc kubenswrapper[4921]: I0312 15:31:53.454316 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/galera/0.log" Mar 12 15:31:53 crc kubenswrapper[4921]: I0312 15:31:53.893103 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_345031e5-3e52-4b4e-ba3d-73bc5c3fe95d/openstackclient/0.log" Mar 12 
15:31:54 crc kubenswrapper[4921]: I0312 15:31:54.329570 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zhfgt_0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb/openstack-network-exporter/0.log" Mar 12 15:31:54 crc kubenswrapper[4921]: I0312 15:31:54.728016 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server/0.log" Mar 12 15:31:55 crc kubenswrapper[4921]: I0312 15:31:55.226585 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s4mtb_6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb/ovn-controller/0.log" Mar 12 15:31:55 crc kubenswrapper[4921]: I0312 15:31:55.667221 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-p2wxb_8697c3cf-f4d2-45fb-9347-c580192e39d2/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.093878 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47b82052-6f75-4fe5-b4af-9726f2a59c2f/ovn-northd/0.log" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.324064 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.324192 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.324254 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.325076 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.325155 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" gracePeriod=600 Mar 12 15:31:56 crc kubenswrapper[4921]: E0312 15:31:56.470960 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.497501 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ed0ceb5e-c541-4d3f-99b9-1865684ffa9d/ovsdbserver-nb/0.log" Mar 12 15:31:56 crc kubenswrapper[4921]: I0312 15:31:56.937507 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_cae9c939-db1a-4372-b8a0-ff4e9892cb85/ovsdbserver-nb/0.log" Mar 12 15:31:57 crc kubenswrapper[4921]: I0312 15:31:57.166355 4921 generic.go:334] "Generic (PLEG): container finished" 
podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" exitCode=0 Mar 12 15:31:57 crc kubenswrapper[4921]: I0312 15:31:57.166453 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54"} Mar 12 15:31:57 crc kubenswrapper[4921]: I0312 15:31:57.166994 4921 scope.go:117] "RemoveContainer" containerID="2c8ef7671ee7546b2decc42a107ed27fc61073a784d7daa28725d0413b1e5ae7" Mar 12 15:31:57 crc kubenswrapper[4921]: I0312 15:31:57.168730 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:31:57 crc kubenswrapper[4921]: E0312 15:31:57.169418 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:31:57 crc kubenswrapper[4921]: I0312 15:31:57.374784 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_228e4171-a3c9-483e-bfa6-1e0cef68384c/ovsdbserver-sb/0.log" Mar 12 15:31:58 crc kubenswrapper[4921]: I0312 15:31:58.404120 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7ffb8f48-l6m2k_0091a555-ed5b-415c-ba49-7d2c64fdf54d/placement-log/0.log" Mar 12 15:31:58 crc kubenswrapper[4921]: I0312 15:31:58.818001 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/rabbitmq/0.log" Mar 12 15:31:59 crc 
kubenswrapper[4921]: I0312 15:31:59.229458 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/rabbitmq/0.log" Mar 12 15:31:59 crc kubenswrapper[4921]: I0312 15:31:59.626930 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z_55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.082316 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5_66cfa5a2-1910-4504-84cb-24e75749c210/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.147724 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555492-j9ww9"] Mar 12 15:32:00 crc kubenswrapper[4921]: E0312 15:32:00.148391 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="extract-content" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.148421 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="extract-content" Mar 12 15:32:00 crc kubenswrapper[4921]: E0312 15:32:00.148434 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="registry-server" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.148443 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="registry-server" Mar 12 15:32:00 crc kubenswrapper[4921]: E0312 15:32:00.148461 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="extract-utilities" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.148470 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="extract-utilities" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.148803 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef5af1f-fcf7-4911-b10f-dc7bae54c2af" containerName="registry-server" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.149929 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.152436 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.152571 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.152720 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.157503 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-j9ww9"] Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.247035 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqss\" (UniqueName: \"kubernetes.io/projected/a109c9d1-84c7-46ed-8631-37b6d309a388-kube-api-access-jxqss\") pod \"auto-csr-approver-29555492-j9ww9\" (UID: \"a109c9d1-84c7-46ed-8631-37b6d309a388\") " pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.350544 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqss\" (UniqueName: \"kubernetes.io/projected/a109c9d1-84c7-46ed-8631-37b6d309a388-kube-api-access-jxqss\") pod \"auto-csr-approver-29555492-j9ww9\" (UID: 
\"a109c9d1-84c7-46ed-8631-37b6d309a388\") " pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.371239 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqss\" (UniqueName: \"kubernetes.io/projected/a109c9d1-84c7-46ed-8631-37b6d309a388-kube-api-access-jxqss\") pod \"auto-csr-approver-29555492-j9ww9\" (UID: \"a109c9d1-84c7-46ed-8631-37b6d309a388\") " pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.475789 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.531975 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nzzfd_095fb2e2-a411-4c41-bf21-1c8b69166a54/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.951231 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7x2dm_7dc60d30-c59f-4cd4-b798-7e8214c0fa52/ssh-known-hosts-edpm-deployment/0.log" Mar 12 15:32:00 crc kubenswrapper[4921]: I0312 15:32:00.960327 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-j9ww9"] Mar 12 15:32:01 crc kubenswrapper[4921]: I0312 15:32:01.215258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" event={"ID":"a109c9d1-84c7-46ed-8631-37b6d309a388","Type":"ContainerStarted","Data":"3272a36cc99daa3aec2907a974a7660826fa880ff5621b35f8c1dff40a243baf"} Mar 12 15:32:01 crc kubenswrapper[4921]: I0312 15:32:01.463298 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b061c47e-9c37-48ed-a879-9263d780de9f/tempest-tests-tempest-tests-runner/0.log" Mar 12 15:32:01 crc 
kubenswrapper[4921]: I0312 15:32:01.871559 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5d16b762-c737-4831-ae57-099f1da5d7fb/test-operator-logs-container/0.log" Mar 12 15:32:02 crc kubenswrapper[4921]: I0312 15:32:02.281199 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm_36211ec3-db4f-4485-a93d-08dd120af919/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:32:03 crc kubenswrapper[4921]: I0312 15:32:03.237593 4921 generic.go:334] "Generic (PLEG): container finished" podID="a109c9d1-84c7-46ed-8631-37b6d309a388" containerID="169778feb9448b75c38a3428bbfda9502b2b3a16781990d8545c0b97334b970d" exitCode=0 Mar 12 15:32:03 crc kubenswrapper[4921]: I0312 15:32:03.237840 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" event={"ID":"a109c9d1-84c7-46ed-8631-37b6d309a388","Type":"ContainerDied","Data":"169778feb9448b75c38a3428bbfda9502b2b3a16781990d8545c0b97334b970d"} Mar 12 15:32:04 crc kubenswrapper[4921]: I0312 15:32:04.612300 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:04 crc kubenswrapper[4921]: I0312 15:32:04.655968 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqss\" (UniqueName: \"kubernetes.io/projected/a109c9d1-84c7-46ed-8631-37b6d309a388-kube-api-access-jxqss\") pod \"a109c9d1-84c7-46ed-8631-37b6d309a388\" (UID: \"a109c9d1-84c7-46ed-8631-37b6d309a388\") " Mar 12 15:32:04 crc kubenswrapper[4921]: I0312 15:32:04.664269 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a109c9d1-84c7-46ed-8631-37b6d309a388-kube-api-access-jxqss" (OuterVolumeSpecName: "kube-api-access-jxqss") pod "a109c9d1-84c7-46ed-8631-37b6d309a388" (UID: "a109c9d1-84c7-46ed-8631-37b6d309a388"). InnerVolumeSpecName "kube-api-access-jxqss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:32:04 crc kubenswrapper[4921]: I0312 15:32:04.758983 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqss\" (UniqueName: \"kubernetes.io/projected/a109c9d1-84c7-46ed-8631-37b6d309a388-kube-api-access-jxqss\") on node \"crc\" DevicePath \"\"" Mar 12 15:32:05 crc kubenswrapper[4921]: I0312 15:32:05.261176 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" event={"ID":"a109c9d1-84c7-46ed-8631-37b6d309a388","Type":"ContainerDied","Data":"3272a36cc99daa3aec2907a974a7660826fa880ff5621b35f8c1dff40a243baf"} Mar 12 15:32:05 crc kubenswrapper[4921]: I0312 15:32:05.261677 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3272a36cc99daa3aec2907a974a7660826fa880ff5621b35f8c1dff40a243baf" Mar 12 15:32:05 crc kubenswrapper[4921]: I0312 15:32:05.261235 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555492-j9ww9" Mar 12 15:32:05 crc kubenswrapper[4921]: I0312 15:32:05.703337 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-7twgj"] Mar 12 15:32:05 crc kubenswrapper[4921]: I0312 15:32:05.713977 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555486-7twgj"] Mar 12 15:32:05 crc kubenswrapper[4921]: I0312 15:32:05.996281 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae661f9-d836-4ace-8e58-8a4b362c683a" path="/var/lib/kubelet/pods/8ae661f9-d836-4ace-8e58-8a4b362c683a/volumes" Mar 12 15:32:09 crc kubenswrapper[4921]: I0312 15:32:09.984313 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:32:09 crc kubenswrapper[4921]: E0312 15:32:09.985574 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:32:21 crc kubenswrapper[4921]: I0312 15:32:21.983889 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:32:21 crc kubenswrapper[4921]: E0312 15:32:21.984979 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:32:32 crc kubenswrapper[4921]: I0312 15:32:32.984630 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:32:32 crc kubenswrapper[4921]: E0312 15:32:32.986068 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:32:36 crc kubenswrapper[4921]: I0312 15:32:36.972496 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/extract/0.log" Mar 12 15:32:45 crc kubenswrapper[4921]: I0312 15:32:45.983750 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:32:45 crc kubenswrapper[4921]: E0312 15:32:45.984873 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:32:52 crc kubenswrapper[4921]: I0312 15:32:52.263024 4921 scope.go:117] "RemoveContainer" containerID="9c278395ac90ec85b24be344af27b73c70ec3445115c61ecbf5a670dfe309326" Mar 12 15:32:52 crc kubenswrapper[4921]: I0312 15:32:52.372044 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-dmwhv_0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea/manager/0.log" Mar 12 15:32:55 crc kubenswrapper[4921]: I0312 15:32:55.499592 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-zmq56_ac8d4a43-01b6-438e-b1d8-d3521ed82176/manager/0.log" Mar 12 15:32:56 crc kubenswrapper[4921]: I0312 15:32:56.049507 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-j46tf_5908e8b2-d088-4190-8ccf-ea7526921e80/manager/0.log" Mar 12 15:32:56 crc kubenswrapper[4921]: I0312 15:32:56.525349 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-5jt7c_7494cb10-090c-4ac2-bbf1-663979f3e4cf/manager/0.log" Mar 12 15:32:56 crc kubenswrapper[4921]: I0312 15:32:56.908147 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-nq8wj_c6de3785-ea06-49bb-9b39-d8f2f10bce81/manager/0.log" Mar 12 15:32:56 crc kubenswrapper[4921]: I0312 15:32:56.983229 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:32:56 crc kubenswrapper[4921]: E0312 15:32:56.983578 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:32:57 crc kubenswrapper[4921]: I0312 15:32:57.323723 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fp4rs_001425f5-0a2a-4bdc-a437-d6f9ba3687b4/manager/0.log" Mar 12 15:32:58 crc kubenswrapper[4921]: I0312 15:32:58.031521 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-9tkrv_c09491c8-72c5-4019-91bf-37ee1a3a937c/manager/0.log" Mar 12 15:32:58 crc kubenswrapper[4921]: I0312 15:32:58.535622 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-67xqg_6a1a1aea-a74a-4886-ae24-1d188243e859/manager/0.log" Mar 12 15:32:58 crc kubenswrapper[4921]: I0312 15:32:58.978768 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v42m2_d4de9b0c-3812-462a-aa80-ffe00e6d47ca/manager/0.log" Mar 12 15:32:59 crc kubenswrapper[4921]: I0312 15:32:59.392714 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-xzm8h_fd1bc9ca-529d-4d59-a236-db1bb5c121ca/manager/0.log" Mar 12 15:32:59 crc kubenswrapper[4921]: I0312 15:32:59.805858 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-692s5_6131e4c9-d85a-4cdf-9cec-128c9e81bc29/manager/0.log" Mar 12 15:33:00 crc kubenswrapper[4921]: I0312 15:33:00.242076 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-kzh67_2394f3bd-4f8b-4036-b240-7ed71b80798a/manager/0.log" Mar 12 15:33:00 crc kubenswrapper[4921]: I0312 15:33:00.706329 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-686d5f9fbd-hmkmx_1a0b0ff9-21c3-452f-9ded-00d374fbbcbe/manager/0.log" Mar 12 15:33:01 crc kubenswrapper[4921]: I0312 15:33:01.105151 4921 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-bz8j7_4e1ee178-3f0e-405a-93cb-9414b2fccbe0/manager/0.log" Mar 12 15:33:01 crc kubenswrapper[4921]: I0312 15:33:01.527458 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8_0c9cd39f-8440-4f22-82ce-d3be95bea1be/manager/0.log" Mar 12 15:33:02 crc kubenswrapper[4921]: I0312 15:33:02.039845 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5bc4df7446-bp8nq_c7db0c3c-40e2-49df-bffc-c0f94b26c92f/operator/0.log" Mar 12 15:33:04 crc kubenswrapper[4921]: I0312 15:33:04.082071 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5785b7957-24wxp_9b888138-4648-48a6-9364-639fb0e0c8b6/manager/0.log" Mar 12 15:33:04 crc kubenswrapper[4921]: I0312 15:33:04.495425 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rrhpc_5f20d433-83bd-4524-a6ce-ef19ef8a1064/registry-server/0.log" Mar 12 15:33:04 crc kubenswrapper[4921]: I0312 15:33:04.944667 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-x4tf4_994c3a47-47a7-4fbe-9f9c-df011597775b/manager/0.log" Mar 12 15:33:05 crc kubenswrapper[4921]: I0312 15:33:05.437191 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-64dcj_3a930c0b-6c3b-4a1d-b02f-1190a124ceb2/manager/0.log" Mar 12 15:33:05 crc kubenswrapper[4921]: I0312 15:33:05.834774 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h97zm_f0da206d-658e-47e1-9cfb-5b74237c406a/operator/0.log" Mar 12 15:33:06 crc kubenswrapper[4921]: I0312 15:33:06.230441 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m842c_f2c81917-4047-4d0b-baed-45afa8a53a60/manager/0.log" Mar 12 15:33:06 crc kubenswrapper[4921]: I0312 15:33:06.668580 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-dlgkj_fe35cc9d-bfc6-4a4d-b21f-06ab55672726/manager/0.log" Mar 12 15:33:07 crc kubenswrapper[4921]: I0312 15:33:07.031070 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-2sf7v_ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b/manager/0.log" Mar 12 15:33:07 crc kubenswrapper[4921]: I0312 15:33:07.349349 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-7l7sm_2db21a73-26d9-44d6-aa91-ba8068b0525a/manager/0.log" Mar 12 15:33:07 crc kubenswrapper[4921]: I0312 15:33:07.990227 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:33:07 crc kubenswrapper[4921]: E0312 15:33:07.990934 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:33:12 crc kubenswrapper[4921]: I0312 15:33:12.206929 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bcbd96998-bx4p5_59a6f440-5a89-42a7-baa1-77a875476665/barbican-api-log/0.log" Mar 12 15:33:13 crc kubenswrapper[4921]: I0312 15:33:13.038357 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-76b64f84d4-tpqnj_47867e82-3783-4f22-bc4f-9128016cf98e/barbican-keystone-listener-log/0.log" Mar 12 15:33:13 crc kubenswrapper[4921]: I0312 15:33:13.540498 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-594f99766c-xf6hh_6c4d7515-b40d-418c-b32e-b6a857c040a7/barbican-worker-log/0.log" Mar 12 15:33:14 crc kubenswrapper[4921]: I0312 15:33:14.078437 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf_e5130d9e-9678-42d8-9394-bcced05db054/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:14 crc kubenswrapper[4921]: I0312 15:33:14.631947 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/ceilometer-central-agent/0.log" Mar 12 15:33:15 crc kubenswrapper[4921]: I0312 15:33:15.142059 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-dt558_f5b6000a-13f1-4d52-9a03-3b777b3d651d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:15 crc kubenswrapper[4921]: I0312 15:33:15.633102 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk_cbaebc43-5127-4000-abb3-79a878177cd2/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:16 crc kubenswrapper[4921]: I0312 15:33:16.249245 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5b74f92-1f9b-4321-b549-47269e3eb04c/cinder-api-log/0.log" Mar 12 15:33:16 crc kubenswrapper[4921]: I0312 15:33:16.899686 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-1_b1c64c98-e301-4386-b33e-ccd4fde7592d/cinder-api-log/0.log" Mar 12 15:33:20 crc kubenswrapper[4921]: I0312 15:33:20.003615 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-backup-0_0ca55d43-e73b-403b-9760-f71e8b926650/cinder-backup/0.log" Mar 12 15:33:20 crc kubenswrapper[4921]: I0312 15:33:20.573722 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7cda98bc-d6ac-4204-8477-8ecd7dafb976/cinder-scheduler/0.log" Mar 12 15:33:22 crc kubenswrapper[4921]: I0312 15:33:22.983347 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:33:22 crc kubenswrapper[4921]: E0312 15:33:22.984421 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:33:23 crc kubenswrapper[4921]: I0312 15:33:23.465106 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8671593e-1709-4d99-ae81-8639ee492d20/cinder-volume/0.log" Mar 12 15:33:24 crc kubenswrapper[4921]: I0312 15:33:24.018943 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw_0a18ea59-b5e6-40e3-8096-0f2bda4563bb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:24 crc kubenswrapper[4921]: I0312 15:33:24.549291 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pcpck_5a0ab9f2-e0b6-40e1-9816-a11f8135ed75/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:25 crc kubenswrapper[4921]: I0312 15:33:25.748005 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/dnsmasq-dns/0.log" Mar 12 15:33:26 crc kubenswrapper[4921]: I0312 15:33:26.185797 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ddcb284-70a7-47da-8b0e-e5ba1f0a9443/glance-log/0.log" Mar 12 15:33:26 crc kubenswrapper[4921]: I0312 15:33:26.610550 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-1_5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b/glance-log/0.log" Mar 12 15:33:27 crc kubenswrapper[4921]: I0312 15:33:27.007910 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d506b9f9-1563-432f-9b21-760ceb017fe9/glance-log/0.log" Mar 12 15:33:27 crc kubenswrapper[4921]: I0312 15:33:27.427879 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-1_739d7b6f-9f1d-4052-958f-e08821db9361/glance-log/0.log" Mar 12 15:33:28 crc kubenswrapper[4921]: I0312 15:33:28.862427 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-bbd56cc76-cwl96_e6e62dec-8193-4d3c-a111-2ee250f79b86/horizon-log/0.log" Mar 12 15:33:29 crc kubenswrapper[4921]: I0312 15:33:29.316552 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl_c4eac827-ab86-4fef-b974-8638416f5125/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:29 crc kubenswrapper[4921]: I0312 15:33:29.731181 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hfsqf_56567424-34cd-49a4-ad03-c72a25a07058/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:31 crc kubenswrapper[4921]: I0312 15:33:31.548389 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-c8b44c5c7-l6d8m_8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118/keystone-api/0.log" Mar 12 15:33:33 crc kubenswrapper[4921]: I0312 15:33:33.524896 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c8b44c5c7-pc46f_3fcdfac3-13b0-42ac-9396-587a7d443e2a/keystone-api/0.log" Mar 12 15:33:33 crc kubenswrapper[4921]: I0312 15:33:33.983297 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:33:33 crc kubenswrapper[4921]: E0312 15:33:33.983624 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:33:33 crc kubenswrapper[4921]: I0312 15:33:33.992271 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555401-cfhz9_c85b992e-689f-4f2f-9799-da7e608f6ca8/keystone-cron/0.log" Mar 12 15:33:34 crc kubenswrapper[4921]: I0312 15:33:34.401857 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555461-nscpw_ce60198f-3189-4ce6-b4a7-32387eb98fa7/keystone-cron/0.log" Mar 12 15:33:34 crc kubenswrapper[4921]: I0312 15:33:34.833104 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_01d94a77-b0dc-48b9-863b-71dbccd74bfb/kube-state-metrics/0.log" Mar 12 15:33:35 crc kubenswrapper[4921]: I0312 15:33:35.286683 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6_2ee1e205-39b3-4648-8c21-4a7cd46b867f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:43 crc kubenswrapper[4921]: I0312 
15:33:43.519075 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f0c221da-6e02-450a-a048-9c8292c208ff/memcached/0.log" Mar 12 15:33:48 crc kubenswrapper[4921]: I0312 15:33:48.252638 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-42nbh_f1a475b3-67ed-40db-b403-0f82930d5d36/neutron-api/0.log" Mar 12 15:33:48 crc kubenswrapper[4921]: I0312 15:33:48.982980 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:33:48 crc kubenswrapper[4921]: E0312 15:33:48.983473 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:33:52 crc kubenswrapper[4921]: I0312 15:33:52.611638 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-9c58r_4d97370e-b2d5-463a-ba6d-5e8e12618140/neutron-api/0.log" Mar 12 15:33:53 crc kubenswrapper[4921]: I0312 15:33:53.470574 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2_f5126789-42a1-4b3d-bc96-384b4db790b6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:33:55 crc kubenswrapper[4921]: I0312 15:33:55.545802 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_148f1f44-e990-4353-b376-1ccbb7f01d0a/nova-api-log/0.log" Mar 12 15:33:58 crc kubenswrapper[4921]: I0312 15:33:58.056662 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-1_ae5ecb59-c6e0-4a5f-a034-059935a3eaff/nova-api-log/0.log" Mar 12 15:33:59 crc 
kubenswrapper[4921]: I0312 15:33:59.045422 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_072b6f7c-f4af-4657-82e6-ff8acb7404d5/nova-cell0-conductor-conductor/0.log" Mar 12 15:33:59 crc kubenswrapper[4921]: I0312 15:33:59.641219 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a7798e1f-b22a-4ebd-a812-e8c17694cf60/nova-cell1-conductor-conductor/0.log" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.154054 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555494-ggpnd"] Mar 12 15:34:00 crc kubenswrapper[4921]: E0312 15:34:00.154622 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a109c9d1-84c7-46ed-8631-37b6d309a388" containerName="oc" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.154646 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="a109c9d1-84c7-46ed-8631-37b6d309a388" containerName="oc" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.154908 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="a109c9d1-84c7-46ed-8631-37b6d309a388" containerName="oc" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.155862 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.158202 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.158425 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.158587 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.167696 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-ggpnd"] Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.204498 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6f997ce1-fc3d-4a1c-b9a8-d357e879f70d/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.326793 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rklqh\" (UniqueName: \"kubernetes.io/projected/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71-kube-api-access-rklqh\") pod \"auto-csr-approver-29555494-ggpnd\" (UID: \"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71\") " pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.430300 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rklqh\" (UniqueName: \"kubernetes.io/projected/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71-kube-api-access-rklqh\") pod \"auto-csr-approver-29555494-ggpnd\" (UID: \"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71\") " pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.450324 4921 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-rklqh\" (UniqueName: \"kubernetes.io/projected/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71-kube-api-access-rklqh\") pod \"auto-csr-approver-29555494-ggpnd\" (UID: \"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71\") " pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.483246 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.695727 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j_bcef78dc-2d5d-4a04-b106-2b54e1b11292/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.959858 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-ggpnd"] Mar 12 15:34:00 crc kubenswrapper[4921]: I0312 15:34:00.969024 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:34:01 crc kubenswrapper[4921]: I0312 15:34:01.219973 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a8089872-446f-4355-94d8-8b82e1b04030/nova-metadata-log/0.log" Mar 12 15:34:01 crc kubenswrapper[4921]: I0312 15:34:01.338582 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" event={"ID":"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71","Type":"ContainerStarted","Data":"0b34e324f42dfe33b2f87ae16644a989e9926b2db7999f9720f8913f91c73cfd"} Mar 12 15:34:01 crc kubenswrapper[4921]: I0312 15:34:01.983082 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:34:01 crc kubenswrapper[4921]: E0312 15:34:01.983378 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:34:02 crc kubenswrapper[4921]: I0312 15:34:02.821551 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3862104-1cf4-4b79-ab48-f94ad1e83964/nova-scheduler-scheduler/0.log" Mar 12 15:34:03 crc kubenswrapper[4921]: I0312 15:34:03.266418 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/galera/0.log" Mar 12 15:34:03 crc kubenswrapper[4921]: I0312 15:34:03.364104 4921 generic.go:334] "Generic (PLEG): container finished" podID="dd441807-f8ad-47d7-8a7a-b4f01fbc7e71" containerID="dfce88e1982cbc7a0d76d4e8be83401431411d477402072842a4ec5e0909bd9a" exitCode=0 Mar 12 15:34:03 crc kubenswrapper[4921]: I0312 15:34:03.364157 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" event={"ID":"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71","Type":"ContainerDied","Data":"dfce88e1982cbc7a0d76d4e8be83401431411d477402072842a4ec5e0909bd9a"} Mar 12 15:34:03 crc kubenswrapper[4921]: I0312 15:34:03.722028 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/galera/0.log" Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.150047 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_345031e5-3e52-4b4e-ba3d-73bc5c3fe95d/openstackclient/0.log" Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.586597 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-zhfgt_0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb/openstack-network-exporter/0.log" Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.715557 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.837904 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rklqh\" (UniqueName: \"kubernetes.io/projected/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71-kube-api-access-rklqh\") pod \"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71\" (UID: \"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71\") " Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.844684 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71-kube-api-access-rklqh" (OuterVolumeSpecName: "kube-api-access-rklqh") pod "dd441807-f8ad-47d7-8a7a-b4f01fbc7e71" (UID: "dd441807-f8ad-47d7-8a7a-b4f01fbc7e71"). InnerVolumeSpecName "kube-api-access-rklqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.940992 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rklqh\" (UniqueName: \"kubernetes.io/projected/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71-kube-api-access-rklqh\") on node \"crc\" DevicePath \"\"" Mar 12 15:34:04 crc kubenswrapper[4921]: I0312 15:34:04.971475 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server/0.log" Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.383608 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" event={"ID":"dd441807-f8ad-47d7-8a7a-b4f01fbc7e71","Type":"ContainerDied","Data":"0b34e324f42dfe33b2f87ae16644a989e9926b2db7999f9720f8913f91c73cfd"} Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.383658 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b34e324f42dfe33b2f87ae16644a989e9926b2db7999f9720f8913f91c73cfd" Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.383669 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555494-ggpnd" Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.397362 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s4mtb_6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb/ovn-controller/0.log" Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.783563 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-sdk2q"] Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.796069 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555488-sdk2q"] Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.861858 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-p2wxb_8697c3cf-f4d2-45fb-9347-c580192e39d2/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:34:05 crc kubenswrapper[4921]: I0312 15:34:05.993910 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594712a0-438a-4239-86bf-77785f152327" path="/var/lib/kubelet/pods/594712a0-438a-4239-86bf-77785f152327/volumes" Mar 12 15:34:06 crc kubenswrapper[4921]: I0312 15:34:06.343643 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47b82052-6f75-4fe5-b4af-9726f2a59c2f/ovn-northd/0.log" Mar 12 15:34:06 crc kubenswrapper[4921]: I0312 15:34:06.797938 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ed0ceb5e-c541-4d3f-99b9-1865684ffa9d/ovsdbserver-nb/0.log" Mar 12 15:34:07 crc kubenswrapper[4921]: I0312 15:34:07.265285 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_cae9c939-db1a-4372-b8a0-ff4e9892cb85/ovsdbserver-nb/0.log" Mar 12 15:34:07 crc kubenswrapper[4921]: I0312 15:34:07.694137 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_228e4171-a3c9-483e-bfa6-1e0cef68384c/ovsdbserver-sb/0.log" Mar 12 15:34:08 crc kubenswrapper[4921]: I0312 15:34:08.801907 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7ffb8f48-l6m2k_0091a555-ed5b-415c-ba49-7d2c64fdf54d/placement-log/0.log" Mar 12 15:34:09 crc kubenswrapper[4921]: I0312 15:34:09.226387 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/rabbitmq/0.log" Mar 12 15:34:09 crc kubenswrapper[4921]: I0312 15:34:09.676259 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/rabbitmq/0.log" Mar 12 15:34:10 crc kubenswrapper[4921]: I0312 15:34:10.087032 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z_55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:34:10 crc kubenswrapper[4921]: I0312 15:34:10.518381 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5_66cfa5a2-1910-4504-84cb-24e75749c210/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:34:10 crc kubenswrapper[4921]: I0312 15:34:10.967561 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nzzfd_095fb2e2-a411-4c41-bf21-1c8b69166a54/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:34:11 crc kubenswrapper[4921]: I0312 15:34:11.410604 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7x2dm_7dc60d30-c59f-4cd4-b798-7e8214c0fa52/ssh-known-hosts-edpm-deployment/0.log" Mar 12 15:34:11 crc kubenswrapper[4921]: I0312 15:34:11.873826 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_b061c47e-9c37-48ed-a879-9263d780de9f/tempest-tests-tempest-tests-runner/0.log" Mar 12 15:34:12 crc kubenswrapper[4921]: I0312 15:34:12.291796 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5d16b762-c737-4831-ae57-099f1da5d7fb/test-operator-logs-container/0.log" Mar 12 15:34:12 crc kubenswrapper[4921]: I0312 15:34:12.753417 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm_36211ec3-db4f-4485-a93d-08dd120af919/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:34:14 crc kubenswrapper[4921]: I0312 15:34:14.983758 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:34:14 crc kubenswrapper[4921]: E0312 15:34:14.984083 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:34:18 crc kubenswrapper[4921]: I0312 15:34:18.997679 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q88bg"] Mar 12 15:34:18 crc kubenswrapper[4921]: E0312 15:34:18.998691 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd441807-f8ad-47d7-8a7a-b4f01fbc7e71" containerName="oc" Mar 12 15:34:18 crc kubenswrapper[4921]: I0312 15:34:18.998705 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd441807-f8ad-47d7-8a7a-b4f01fbc7e71" containerName="oc" Mar 12 15:34:18 crc kubenswrapper[4921]: I0312 15:34:18.999024 4921 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dd441807-f8ad-47d7-8a7a-b4f01fbc7e71" containerName="oc" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.000451 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.019671 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q88bg"] Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.066144 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-catalog-content\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.066510 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-utilities\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.066678 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxb5\" (UniqueName: \"kubernetes.io/projected/db225958-f98a-495a-8890-9e58c4afd19d-kube-api-access-clxb5\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.168796 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxb5\" (UniqueName: \"kubernetes.io/projected/db225958-f98a-495a-8890-9e58c4afd19d-kube-api-access-clxb5\") pod 
\"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.168957 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-catalog-content\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.169117 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-utilities\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.169636 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-catalog-content\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.169804 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-utilities\") pod \"redhat-marketplace-q88bg\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.189909 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxb5\" (UniqueName: \"kubernetes.io/projected/db225958-f98a-495a-8890-9e58c4afd19d-kube-api-access-clxb5\") pod \"redhat-marketplace-q88bg\" (UID: 
\"db225958-f98a-495a-8890-9e58c4afd19d\") " pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.322509 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:19 crc kubenswrapper[4921]: I0312 15:34:19.820006 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q88bg"] Mar 12 15:34:20 crc kubenswrapper[4921]: I0312 15:34:20.531791 4921 generic.go:334] "Generic (PLEG): container finished" podID="db225958-f98a-495a-8890-9e58c4afd19d" containerID="7bfe957de2b8815a8bf650d20dbdfdf3eb07098ae657c27d61236cef118fe6b8" exitCode=0 Mar 12 15:34:20 crc kubenswrapper[4921]: I0312 15:34:20.531927 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerDied","Data":"7bfe957de2b8815a8bf650d20dbdfdf3eb07098ae657c27d61236cef118fe6b8"} Mar 12 15:34:20 crc kubenswrapper[4921]: I0312 15:34:20.533794 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerStarted","Data":"2401b9c1b249af7e41e2f148b999729e5c65867cfe1dc65ac6612a086d04d7c4"} Mar 12 15:34:21 crc kubenswrapper[4921]: I0312 15:34:21.545501 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerStarted","Data":"1e1fbd17d69073dd1b95749c8d63aa82f733d3d56d0e336729665947262dd208"} Mar 12 15:34:22 crc kubenswrapper[4921]: I0312 15:34:22.564537 4921 generic.go:334] "Generic (PLEG): container finished" podID="db225958-f98a-495a-8890-9e58c4afd19d" containerID="1e1fbd17d69073dd1b95749c8d63aa82f733d3d56d0e336729665947262dd208" exitCode=0 Mar 12 15:34:22 crc kubenswrapper[4921]: I0312 
15:34:22.564599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerDied","Data":"1e1fbd17d69073dd1b95749c8d63aa82f733d3d56d0e336729665947262dd208"} Mar 12 15:34:23 crc kubenswrapper[4921]: I0312 15:34:23.576155 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerStarted","Data":"9c871a4be64049f2fa53adda52b7bb9ae3dc33b39b962c4de92f90531338a273"} Mar 12 15:34:23 crc kubenswrapper[4921]: I0312 15:34:23.605294 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q88bg" podStartSLOduration=3.117080225 podStartE2EDuration="5.60527651s" podCreationTimestamp="2026-03-12 15:34:18 +0000 UTC" firstStartedPulling="2026-03-12 15:34:20.533483605 +0000 UTC m=+8683.223555576" lastFinishedPulling="2026-03-12 15:34:23.02167989 +0000 UTC m=+8685.711751861" observedRunningTime="2026-03-12 15:34:23.594502646 +0000 UTC m=+8686.284574617" watchObservedRunningTime="2026-03-12 15:34:23.60527651 +0000 UTC m=+8686.295348481" Mar 12 15:34:28 crc kubenswrapper[4921]: I0312 15:34:28.983946 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:34:28 crc kubenswrapper[4921]: E0312 15:34:28.984789 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:34:29 crc kubenswrapper[4921]: I0312 15:34:29.323504 4921 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:29 crc kubenswrapper[4921]: I0312 15:34:29.323570 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:29 crc kubenswrapper[4921]: I0312 15:34:29.373354 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:29 crc kubenswrapper[4921]: I0312 15:34:29.681581 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:29 crc kubenswrapper[4921]: I0312 15:34:29.733690 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q88bg"] Mar 12 15:34:31 crc kubenswrapper[4921]: I0312 15:34:31.651077 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q88bg" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="registry-server" containerID="cri-o://9c871a4be64049f2fa53adda52b7bb9ae3dc33b39b962c4de92f90531338a273" gracePeriod=2 Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.692676 4921 generic.go:334] "Generic (PLEG): container finished" podID="db225958-f98a-495a-8890-9e58c4afd19d" containerID="9c871a4be64049f2fa53adda52b7bb9ae3dc33b39b962c4de92f90531338a273" exitCode=0 Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.692723 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerDied","Data":"9c871a4be64049f2fa53adda52b7bb9ae3dc33b39b962c4de92f90531338a273"} Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.693001 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q88bg" 
event={"ID":"db225958-f98a-495a-8890-9e58c4afd19d","Type":"ContainerDied","Data":"2401b9c1b249af7e41e2f148b999729e5c65867cfe1dc65ac6612a086d04d7c4"} Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.693016 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2401b9c1b249af7e41e2f148b999729e5c65867cfe1dc65ac6612a086d04d7c4" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.700208 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.788575 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clxb5\" (UniqueName: \"kubernetes.io/projected/db225958-f98a-495a-8890-9e58c4afd19d-kube-api-access-clxb5\") pod \"db225958-f98a-495a-8890-9e58c4afd19d\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.788752 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-catalog-content\") pod \"db225958-f98a-495a-8890-9e58c4afd19d\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.788840 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-utilities\") pod \"db225958-f98a-495a-8890-9e58c4afd19d\" (UID: \"db225958-f98a-495a-8890-9e58c4afd19d\") " Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.790192 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-utilities" (OuterVolumeSpecName: "utilities") pod "db225958-f98a-495a-8890-9e58c4afd19d" (UID: "db225958-f98a-495a-8890-9e58c4afd19d"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.809882 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db225958-f98a-495a-8890-9e58c4afd19d-kube-api-access-clxb5" (OuterVolumeSpecName: "kube-api-access-clxb5") pod "db225958-f98a-495a-8890-9e58c4afd19d" (UID: "db225958-f98a-495a-8890-9e58c4afd19d"). InnerVolumeSpecName "kube-api-access-clxb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.872981 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db225958-f98a-495a-8890-9e58c4afd19d" (UID: "db225958-f98a-495a-8890-9e58c4afd19d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.892310 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.892356 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clxb5\" (UniqueName: \"kubernetes.io/projected/db225958-f98a-495a-8890-9e58c4afd19d-kube-api-access-clxb5\") on node \"crc\" DevicePath \"\"" Mar 12 15:34:32 crc kubenswrapper[4921]: I0312 15:34:32.892371 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db225958-f98a-495a-8890-9e58c4afd19d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:34:33 crc kubenswrapper[4921]: I0312 15:34:33.705323 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q88bg" Mar 12 15:34:33 crc kubenswrapper[4921]: I0312 15:34:33.749390 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q88bg"] Mar 12 15:34:33 crc kubenswrapper[4921]: I0312 15:34:33.766682 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q88bg"] Mar 12 15:34:33 crc kubenswrapper[4921]: I0312 15:34:33.993675 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db225958-f98a-495a-8890-9e58c4afd19d" path="/var/lib/kubelet/pods/db225958-f98a-495a-8890-9e58c4afd19d/volumes" Mar 12 15:34:41 crc kubenswrapper[4921]: I0312 15:34:41.986241 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:34:41 crc kubenswrapper[4921]: E0312 15:34:41.987098 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:34:48 crc kubenswrapper[4921]: I0312 15:34:48.023517 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/extract/0.log" Mar 12 15:34:52 crc kubenswrapper[4921]: I0312 15:34:52.392984 4921 scope.go:117] "RemoveContainer" containerID="45be2bb72a956c93c3583edf55de76d90b2dfde4092e328553566151abbd1b4e" Mar 12 15:34:53 crc kubenswrapper[4921]: I0312 15:34:53.983231 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:34:53 crc 
kubenswrapper[4921]: E0312 15:34:53.983907 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:35:03 crc kubenswrapper[4921]: I0312 15:35:03.686247 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-dmwhv_0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea/manager/0.log" Mar 12 15:35:04 crc kubenswrapper[4921]: I0312 15:35:04.983285 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:35:04 crc kubenswrapper[4921]: E0312 15:35:04.983542 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:35:06 crc kubenswrapper[4921]: I0312 15:35:06.763806 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-zmq56_ac8d4a43-01b6-438e-b1d8-d3521ed82176/manager/0.log" Mar 12 15:35:07 crc kubenswrapper[4921]: I0312 15:35:07.155789 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-j46tf_5908e8b2-d088-4190-8ccf-ea7526921e80/manager/0.log" Mar 12 15:35:07 crc kubenswrapper[4921]: I0312 15:35:07.625166 4921 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-5jt7c_7494cb10-090c-4ac2-bbf1-663979f3e4cf/manager/0.log" Mar 12 15:35:08 crc kubenswrapper[4921]: I0312 15:35:08.035711 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-nq8wj_c6de3785-ea06-49bb-9b39-d8f2f10bce81/manager/0.log" Mar 12 15:35:08 crc kubenswrapper[4921]: I0312 15:35:08.459851 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fp4rs_001425f5-0a2a-4bdc-a437-d6f9ba3687b4/manager/0.log" Mar 12 15:35:09 crc kubenswrapper[4921]: I0312 15:35:09.111367 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-9tkrv_c09491c8-72c5-4019-91bf-37ee1a3a937c/manager/0.log" Mar 12 15:35:09 crc kubenswrapper[4921]: I0312 15:35:09.461911 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-67xqg_6a1a1aea-a74a-4886-ae24-1d188243e859/manager/0.log" Mar 12 15:35:09 crc kubenswrapper[4921]: I0312 15:35:09.905577 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v42m2_d4de9b0c-3812-462a-aa80-ffe00e6d47ca/manager/0.log" Mar 12 15:35:10 crc kubenswrapper[4921]: I0312 15:35:10.331756 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-xzm8h_fd1bc9ca-529d-4d59-a236-db1bb5c121ca/manager/0.log" Mar 12 15:35:10 crc kubenswrapper[4921]: I0312 15:35:10.774736 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-692s5_6131e4c9-d85a-4cdf-9cec-128c9e81bc29/manager/0.log" Mar 12 15:35:11 crc kubenswrapper[4921]: I0312 15:35:11.180498 4921 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-kzh67_2394f3bd-4f8b-4036-b240-7ed71b80798a/manager/0.log" Mar 12 15:35:11 crc kubenswrapper[4921]: I0312 15:35:11.648927 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-686d5f9fbd-hmkmx_1a0b0ff9-21c3-452f-9ded-00d374fbbcbe/manager/0.log" Mar 12 15:35:12 crc kubenswrapper[4921]: I0312 15:35:12.056219 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-bz8j7_4e1ee178-3f0e-405a-93cb-9414b2fccbe0/manager/0.log" Mar 12 15:35:12 crc kubenswrapper[4921]: I0312 15:35:12.589614 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8_0c9cd39f-8440-4f22-82ce-d3be95bea1be/manager/0.log" Mar 12 15:35:13 crc kubenswrapper[4921]: I0312 15:35:13.115245 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5bc4df7446-bp8nq_c7db0c3c-40e2-49df-bffc-c0f94b26c92f/operator/0.log" Mar 12 15:35:14 crc kubenswrapper[4921]: I0312 15:35:14.946155 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5785b7957-24wxp_9b888138-4648-48a6-9364-639fb0e0c8b6/manager/0.log" Mar 12 15:35:15 crc kubenswrapper[4921]: I0312 15:35:15.300040 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rrhpc_5f20d433-83bd-4524-a6ce-ef19ef8a1064/registry-server/0.log" Mar 12 15:35:15 crc kubenswrapper[4921]: I0312 15:35:15.781790 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-x4tf4_994c3a47-47a7-4fbe-9f9c-df011597775b/manager/0.log" Mar 12 15:35:16 crc kubenswrapper[4921]: I0312 15:35:16.208996 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-64dcj_3a930c0b-6c3b-4a1d-b02f-1190a124ceb2/manager/0.log" Mar 12 15:35:16 crc kubenswrapper[4921]: I0312 15:35:16.605338 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h97zm_f0da206d-658e-47e1-9cfb-5b74237c406a/operator/0.log" Mar 12 15:35:17 crc kubenswrapper[4921]: I0312 15:35:17.035295 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m842c_f2c81917-4047-4d0b-baed-45afa8a53a60/manager/0.log" Mar 12 15:35:17 crc kubenswrapper[4921]: I0312 15:35:17.489041 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-dlgkj_fe35cc9d-bfc6-4a4d-b21f-06ab55672726/manager/0.log" Mar 12 15:35:17 crc kubenswrapper[4921]: I0312 15:35:17.895874 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-2sf7v_ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b/manager/0.log" Mar 12 15:35:18 crc kubenswrapper[4921]: I0312 15:35:18.293492 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-7l7sm_2db21a73-26d9-44d6-aa91-ba8068b0525a/manager/0.log" Mar 12 15:35:19 crc kubenswrapper[4921]: I0312 15:35:19.995434 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:35:19 crc kubenswrapper[4921]: E0312 15:35:19.996686 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:35:34 crc kubenswrapper[4921]: I0312 15:35:34.984913 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:35:34 crc kubenswrapper[4921]: E0312 15:35:34.986586 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.453921 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2x9vp/must-gather-4n57n"] Mar 12 15:35:42 crc kubenswrapper[4921]: E0312 15:35:42.454799 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="registry-server" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.454884 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="registry-server" Mar 12 15:35:42 crc kubenswrapper[4921]: E0312 15:35:42.454922 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="extract-utilities" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.454931 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="extract-utilities" Mar 12 15:35:42 crc kubenswrapper[4921]: E0312 15:35:42.454947 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="extract-content" Mar 12 15:35:42 crc kubenswrapper[4921]: 
I0312 15:35:42.454954 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="extract-content" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.455213 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="db225958-f98a-495a-8890-9e58c4afd19d" containerName="registry-server" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.456414 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.459274 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2x9vp"/"openshift-service-ca.crt" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.463329 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2x9vp"/"kube-root-ca.crt" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.476246 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2x9vp/must-gather-4n57n"] Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.599918 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-must-gather-output\") pod \"must-gather-4n57n\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.599960 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7qp\" (UniqueName: \"kubernetes.io/projected/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-kube-api-access-kp7qp\") pod \"must-gather-4n57n\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.702690 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-must-gather-output\") pod \"must-gather-4n57n\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.702741 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7qp\" (UniqueName: \"kubernetes.io/projected/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-kube-api-access-kp7qp\") pod \"must-gather-4n57n\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.703254 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-must-gather-output\") pod \"must-gather-4n57n\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.721568 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7qp\" (UniqueName: \"kubernetes.io/projected/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-kube-api-access-kp7qp\") pod \"must-gather-4n57n\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:42 crc kubenswrapper[4921]: I0312 15:35:42.790263 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:35:43 crc kubenswrapper[4921]: I0312 15:35:43.230576 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2x9vp/must-gather-4n57n"] Mar 12 15:35:43 crc kubenswrapper[4921]: I0312 15:35:43.303927 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/must-gather-4n57n" event={"ID":"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f","Type":"ContainerStarted","Data":"8cc9fe51d66dc1843fa0c466a49df07a5442dc6de63c342f5d192d9ea94214a9"} Mar 12 15:35:44 crc kubenswrapper[4921]: I0312 15:35:44.314491 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/must-gather-4n57n" event={"ID":"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f","Type":"ContainerStarted","Data":"5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5"} Mar 12 15:35:44 crc kubenswrapper[4921]: I0312 15:35:44.316158 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/must-gather-4n57n" event={"ID":"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f","Type":"ContainerStarted","Data":"69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6"} Mar 12 15:35:45 crc kubenswrapper[4921]: I0312 15:35:45.984052 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:35:45 crc kubenswrapper[4921]: E0312 15:35:45.985383 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.525322 4921 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-2x9vp/must-gather-4n57n" podStartSLOduration=5.525302191 podStartE2EDuration="5.525302191s" podCreationTimestamp="2026-03-12 15:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:35:44.342886223 +0000 UTC m=+8767.032958204" watchObservedRunningTime="2026-03-12 15:35:47.525302191 +0000 UTC m=+8770.215374162" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.535465 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-nrv66"] Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.536869 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.539200 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2x9vp"/"default-dockercfg-84r74" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.704709 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d432a2a2-7a23-459e-a5e0-b603e5e49090-host\") pod \"crc-debug-nrv66\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.704779 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2dp\" (UniqueName: \"kubernetes.io/projected/d432a2a2-7a23-459e-a5e0-b603e5e49090-kube-api-access-nh2dp\") pod \"crc-debug-nrv66\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.806804 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d432a2a2-7a23-459e-a5e0-b603e5e49090-host\") pod \"crc-debug-nrv66\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.806883 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d432a2a2-7a23-459e-a5e0-b603e5e49090-host\") pod \"crc-debug-nrv66\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.806902 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2dp\" (UniqueName: \"kubernetes.io/projected/d432a2a2-7a23-459e-a5e0-b603e5e49090-kube-api-access-nh2dp\") pod \"crc-debug-nrv66\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.834686 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2dp\" (UniqueName: \"kubernetes.io/projected/d432a2a2-7a23-459e-a5e0-b603e5e49090-kube-api-access-nh2dp\") pod \"crc-debug-nrv66\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: I0312 15:35:47.859015 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:35:47 crc kubenswrapper[4921]: W0312 15:35:47.887848 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd432a2a2_7a23_459e_a5e0_b603e5e49090.slice/crio-ea71ee30b7d6b730752cca3464bafd7dec2207ae98dd270d6ac79c9cd73aca18 WatchSource:0}: Error finding container ea71ee30b7d6b730752cca3464bafd7dec2207ae98dd270d6ac79c9cd73aca18: Status 404 returned error can't find the container with id ea71ee30b7d6b730752cca3464bafd7dec2207ae98dd270d6ac79c9cd73aca18 Mar 12 15:35:48 crc kubenswrapper[4921]: I0312 15:35:48.363569 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" event={"ID":"d432a2a2-7a23-459e-a5e0-b603e5e49090","Type":"ContainerStarted","Data":"a5cd69f583528e84702952573f454c3d155e4e4f55ef079121cb44b5443da541"} Mar 12 15:35:48 crc kubenswrapper[4921]: I0312 15:35:48.364403 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" event={"ID":"d432a2a2-7a23-459e-a5e0-b603e5e49090","Type":"ContainerStarted","Data":"ea71ee30b7d6b730752cca3464bafd7dec2207ae98dd270d6ac79c9cd73aca18"} Mar 12 15:35:56 crc kubenswrapper[4921]: I0312 15:35:56.983950 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:35:56 crc kubenswrapper[4921]: E0312 15:35:56.984680 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.143496 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" podStartSLOduration=13.143473264 podStartE2EDuration="13.143473264s" podCreationTimestamp="2026-03-12 15:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:35:48.383784743 +0000 UTC m=+8771.073856724" watchObservedRunningTime="2026-03-12 15:36:00.143473264 +0000 UTC m=+8782.833545255" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.145536 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555496-swhbq"] Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.147526 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.149602 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.150161 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.151319 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.173217 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-swhbq"] Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.273643 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zvh\" (UniqueName: \"kubernetes.io/projected/63494d31-fe81-4c38-8cf3-c23b9224a622-kube-api-access-t9zvh\") pod \"auto-csr-approver-29555496-swhbq\" (UID: \"63494d31-fe81-4c38-8cf3-c23b9224a622\") " 
pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.375110 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zvh\" (UniqueName: \"kubernetes.io/projected/63494d31-fe81-4c38-8cf3-c23b9224a622-kube-api-access-t9zvh\") pod \"auto-csr-approver-29555496-swhbq\" (UID: \"63494d31-fe81-4c38-8cf3-c23b9224a622\") " pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.395551 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zvh\" (UniqueName: \"kubernetes.io/projected/63494d31-fe81-4c38-8cf3-c23b9224a622-kube-api-access-t9zvh\") pod \"auto-csr-approver-29555496-swhbq\" (UID: \"63494d31-fe81-4c38-8cf3-c23b9224a622\") " pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.468364 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:00 crc kubenswrapper[4921]: I0312 15:36:00.933367 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-swhbq"] Mar 12 15:36:01 crc kubenswrapper[4921]: I0312 15:36:01.456702 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-swhbq" event={"ID":"63494d31-fe81-4c38-8cf3-c23b9224a622","Type":"ContainerStarted","Data":"2748dc51513d35f8bcf994811fdf5514a2ee32cbf74ccd4e1fcf90eeb58ff6ce"} Mar 12 15:36:02 crc kubenswrapper[4921]: I0312 15:36:02.466425 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-swhbq" event={"ID":"63494d31-fe81-4c38-8cf3-c23b9224a622","Type":"ContainerStarted","Data":"6eabf58f4c0b301783bc9d83c11bb80d5e3ba65cf922b44717c0021379eab4b3"} Mar 12 15:36:02 crc kubenswrapper[4921]: I0312 15:36:02.490108 4921 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555496-swhbq" podStartSLOduration=1.2852893779999999 podStartE2EDuration="2.490091115s" podCreationTimestamp="2026-03-12 15:36:00 +0000 UTC" firstStartedPulling="2026-03-12 15:36:00.945551165 +0000 UTC m=+8783.635623136" lastFinishedPulling="2026-03-12 15:36:02.150352902 +0000 UTC m=+8784.840424873" observedRunningTime="2026-03-12 15:36:02.482432598 +0000 UTC m=+8785.172504569" watchObservedRunningTime="2026-03-12 15:36:02.490091115 +0000 UTC m=+8785.180163086" Mar 12 15:36:03 crc kubenswrapper[4921]: I0312 15:36:03.482649 4921 generic.go:334] "Generic (PLEG): container finished" podID="63494d31-fe81-4c38-8cf3-c23b9224a622" containerID="6eabf58f4c0b301783bc9d83c11bb80d5e3ba65cf922b44717c0021379eab4b3" exitCode=0 Mar 12 15:36:03 crc kubenswrapper[4921]: I0312 15:36:03.482828 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-swhbq" event={"ID":"63494d31-fe81-4c38-8cf3-c23b9224a622","Type":"ContainerDied","Data":"6eabf58f4c0b301783bc9d83c11bb80d5e3ba65cf922b44717c0021379eab4b3"} Mar 12 15:36:04 crc kubenswrapper[4921]: I0312 15:36:04.888673 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.006454 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9zvh\" (UniqueName: \"kubernetes.io/projected/63494d31-fe81-4c38-8cf3-c23b9224a622-kube-api-access-t9zvh\") pod \"63494d31-fe81-4c38-8cf3-c23b9224a622\" (UID: \"63494d31-fe81-4c38-8cf3-c23b9224a622\") " Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.025026 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63494d31-fe81-4c38-8cf3-c23b9224a622-kube-api-access-t9zvh" (OuterVolumeSpecName: "kube-api-access-t9zvh") pod "63494d31-fe81-4c38-8cf3-c23b9224a622" (UID: "63494d31-fe81-4c38-8cf3-c23b9224a622"). InnerVolumeSpecName "kube-api-access-t9zvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.109227 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9zvh\" (UniqueName: \"kubernetes.io/projected/63494d31-fe81-4c38-8cf3-c23b9224a622-kube-api-access-t9zvh\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.501199 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555496-swhbq" event={"ID":"63494d31-fe81-4c38-8cf3-c23b9224a622","Type":"ContainerDied","Data":"2748dc51513d35f8bcf994811fdf5514a2ee32cbf74ccd4e1fcf90eeb58ff6ce"} Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.501606 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2748dc51513d35f8bcf994811fdf5514a2ee32cbf74ccd4e1fcf90eeb58ff6ce" Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.501342 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555496-swhbq" Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.554270 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-dhtm5"] Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.566044 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555490-dhtm5"] Mar 12 15:36:05 crc kubenswrapper[4921]: I0312 15:36:05.996321 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a13769b-bac4-4484-8f58-fefc1c5532ee" path="/var/lib/kubelet/pods/1a13769b-bac4-4484-8f58-fefc1c5532ee/volumes" Mar 12 15:36:07 crc kubenswrapper[4921]: I0312 15:36:07.992214 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:36:07 crc kubenswrapper[4921]: E0312 15:36:07.993023 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:36:21 crc kubenswrapper[4921]: I0312 15:36:21.983505 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:36:21 crc kubenswrapper[4921]: E0312 15:36:21.984208 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:36:27 crc kubenswrapper[4921]: I0312 15:36:27.687030 4921 generic.go:334] "Generic (PLEG): container finished" podID="d432a2a2-7a23-459e-a5e0-b603e5e49090" containerID="a5cd69f583528e84702952573f454c3d155e4e4f55ef079121cb44b5443da541" exitCode=0 Mar 12 15:36:27 crc kubenswrapper[4921]: I0312 15:36:27.687126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" event={"ID":"d432a2a2-7a23-459e-a5e0-b603e5e49090","Type":"ContainerDied","Data":"a5cd69f583528e84702952573f454c3d155e4e4f55ef079121cb44b5443da541"} Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.801106 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.860121 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-nrv66"] Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.869940 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-nrv66"] Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.912571 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh2dp\" (UniqueName: \"kubernetes.io/projected/d432a2a2-7a23-459e-a5e0-b603e5e49090-kube-api-access-nh2dp\") pod \"d432a2a2-7a23-459e-a5e0-b603e5e49090\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.912761 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d432a2a2-7a23-459e-a5e0-b603e5e49090-host\") pod \"d432a2a2-7a23-459e-a5e0-b603e5e49090\" (UID: \"d432a2a2-7a23-459e-a5e0-b603e5e49090\") " Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.912866 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d432a2a2-7a23-459e-a5e0-b603e5e49090-host" (OuterVolumeSpecName: "host") pod "d432a2a2-7a23-459e-a5e0-b603e5e49090" (UID: "d432a2a2-7a23-459e-a5e0-b603e5e49090"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.913363 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d432a2a2-7a23-459e-a5e0-b603e5e49090-host\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:28 crc kubenswrapper[4921]: I0312 15:36:28.919062 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d432a2a2-7a23-459e-a5e0-b603e5e49090-kube-api-access-nh2dp" (OuterVolumeSpecName: "kube-api-access-nh2dp") pod "d432a2a2-7a23-459e-a5e0-b603e5e49090" (UID: "d432a2a2-7a23-459e-a5e0-b603e5e49090"). InnerVolumeSpecName "kube-api-access-nh2dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:36:29 crc kubenswrapper[4921]: I0312 15:36:29.015097 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh2dp\" (UniqueName: \"kubernetes.io/projected/d432a2a2-7a23-459e-a5e0-b603e5e49090-kube-api-access-nh2dp\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:29 crc kubenswrapper[4921]: I0312 15:36:29.704754 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea71ee30b7d6b730752cca3464bafd7dec2207ae98dd270d6ac79c9cd73aca18" Mar 12 15:36:29 crc kubenswrapper[4921]: I0312 15:36:29.704802 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-nrv66" Mar 12 15:36:29 crc kubenswrapper[4921]: I0312 15:36:29.992126 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d432a2a2-7a23-459e-a5e0-b603e5e49090" path="/var/lib/kubelet/pods/d432a2a2-7a23-459e-a5e0-b603e5e49090/volumes" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.008790 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-qls74"] Mar 12 15:36:30 crc kubenswrapper[4921]: E0312 15:36:30.009187 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d432a2a2-7a23-459e-a5e0-b603e5e49090" containerName="container-00" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.009202 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d432a2a2-7a23-459e-a5e0-b603e5e49090" containerName="container-00" Mar 12 15:36:30 crc kubenswrapper[4921]: E0312 15:36:30.009221 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63494d31-fe81-4c38-8cf3-c23b9224a622" containerName="oc" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.009227 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63494d31-fe81-4c38-8cf3-c23b9224a622" containerName="oc" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.009400 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d432a2a2-7a23-459e-a5e0-b603e5e49090" containerName="container-00" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.009431 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="63494d31-fe81-4c38-8cf3-c23b9224a622" containerName="oc" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.010054 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.012090 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2x9vp"/"default-dockercfg-84r74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.136609 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t5p5\" (UniqueName: \"kubernetes.io/projected/1bf9f752-7792-4d9b-987d-ff72399267f2-kube-api-access-6t5p5\") pod \"crc-debug-qls74\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.136939 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bf9f752-7792-4d9b-987d-ff72399267f2-host\") pod \"crc-debug-qls74\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.238698 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t5p5\" (UniqueName: \"kubernetes.io/projected/1bf9f752-7792-4d9b-987d-ff72399267f2-kube-api-access-6t5p5\") pod \"crc-debug-qls74\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.238889 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bf9f752-7792-4d9b-987d-ff72399267f2-host\") pod \"crc-debug-qls74\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.239071 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1bf9f752-7792-4d9b-987d-ff72399267f2-host\") pod \"crc-debug-qls74\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.263345 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t5p5\" (UniqueName: \"kubernetes.io/projected/1bf9f752-7792-4d9b-987d-ff72399267f2-kube-api-access-6t5p5\") pod \"crc-debug-qls74\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.328000 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.716341 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-qls74" event={"ID":"1bf9f752-7792-4d9b-987d-ff72399267f2","Type":"ContainerStarted","Data":"5f698f2d333370c30f0b8052f511a04f1ec1808c0feab145e64a55405b04938d"} Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.716715 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-qls74" event={"ID":"1bf9f752-7792-4d9b-987d-ff72399267f2","Type":"ContainerStarted","Data":"ed8683e60078e912d415e0d8f3789b1c167f1c32e52c1e82eecb50c051c9b1c7"} Mar 12 15:36:30 crc kubenswrapper[4921]: I0312 15:36:30.733079 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2x9vp/crc-debug-qls74" podStartSLOduration=1.733061255 podStartE2EDuration="1.733061255s" podCreationTimestamp="2026-03-12 15:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 15:36:30.732871289 +0000 UTC m=+8813.422943270" watchObservedRunningTime="2026-03-12 15:36:30.733061255 +0000 UTC m=+8813.423133226" Mar 12 
15:36:31 crc kubenswrapper[4921]: I0312 15:36:31.725317 4921 generic.go:334] "Generic (PLEG): container finished" podID="1bf9f752-7792-4d9b-987d-ff72399267f2" containerID="5f698f2d333370c30f0b8052f511a04f1ec1808c0feab145e64a55405b04938d" exitCode=0 Mar 12 15:36:31 crc kubenswrapper[4921]: I0312 15:36:31.725359 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-qls74" event={"ID":"1bf9f752-7792-4d9b-987d-ff72399267f2","Type":"ContainerDied","Data":"5f698f2d333370c30f0b8052f511a04f1ec1808c0feab145e64a55405b04938d"} Mar 12 15:36:32 crc kubenswrapper[4921]: I0312 15:36:32.856278 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:32 crc kubenswrapper[4921]: I0312 15:36:32.912420 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t5p5\" (UniqueName: \"kubernetes.io/projected/1bf9f752-7792-4d9b-987d-ff72399267f2-kube-api-access-6t5p5\") pod \"1bf9f752-7792-4d9b-987d-ff72399267f2\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " Mar 12 15:36:32 crc kubenswrapper[4921]: I0312 15:36:32.914105 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bf9f752-7792-4d9b-987d-ff72399267f2-host\") pod \"1bf9f752-7792-4d9b-987d-ff72399267f2\" (UID: \"1bf9f752-7792-4d9b-987d-ff72399267f2\") " Mar 12 15:36:32 crc kubenswrapper[4921]: I0312 15:36:32.914212 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1bf9f752-7792-4d9b-987d-ff72399267f2-host" (OuterVolumeSpecName: "host") pod "1bf9f752-7792-4d9b-987d-ff72399267f2" (UID: "1bf9f752-7792-4d9b-987d-ff72399267f2"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:36:32 crc kubenswrapper[4921]: I0312 15:36:32.914691 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1bf9f752-7792-4d9b-987d-ff72399267f2-host\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:32 crc kubenswrapper[4921]: I0312 15:36:32.941119 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf9f752-7792-4d9b-987d-ff72399267f2-kube-api-access-6t5p5" (OuterVolumeSpecName: "kube-api-access-6t5p5") pod "1bf9f752-7792-4d9b-987d-ff72399267f2" (UID: "1bf9f752-7792-4d9b-987d-ff72399267f2"). InnerVolumeSpecName "kube-api-access-6t5p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:36:33 crc kubenswrapper[4921]: I0312 15:36:33.016133 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t5p5\" (UniqueName: \"kubernetes.io/projected/1bf9f752-7792-4d9b-987d-ff72399267f2-kube-api-access-6t5p5\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:33 crc kubenswrapper[4921]: I0312 15:36:33.143900 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-qls74"] Mar 12 15:36:33 crc kubenswrapper[4921]: I0312 15:36:33.154238 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-qls74"] Mar 12 15:36:33 crc kubenswrapper[4921]: I0312 15:36:33.746054 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed8683e60078e912d415e0d8f3789b1c167f1c32e52c1e82eecb50c051c9b1c7" Mar 12 15:36:33 crc kubenswrapper[4921]: I0312 15:36:33.746119 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qls74" Mar 12 15:36:33 crc kubenswrapper[4921]: I0312 15:36:33.994956 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf9f752-7792-4d9b-987d-ff72399267f2" path="/var/lib/kubelet/pods/1bf9f752-7792-4d9b-987d-ff72399267f2/volumes" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.442384 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-qdhl7"] Mar 12 15:36:34 crc kubenswrapper[4921]: E0312 15:36:34.443182 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf9f752-7792-4d9b-987d-ff72399267f2" containerName="container-00" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.443206 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf9f752-7792-4d9b-987d-ff72399267f2" containerName="container-00" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.443455 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf9f752-7792-4d9b-987d-ff72399267f2" containerName="container-00" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.444179 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.446535 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2x9vp"/"default-dockercfg-84r74" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.546884 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6250eecc-a19e-47d9-bc61-81bd620373fc-host\") pod \"crc-debug-qdhl7\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.547015 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztct8\" (UniqueName: \"kubernetes.io/projected/6250eecc-a19e-47d9-bc61-81bd620373fc-kube-api-access-ztct8\") pod \"crc-debug-qdhl7\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.649836 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6250eecc-a19e-47d9-bc61-81bd620373fc-host\") pod \"crc-debug-qdhl7\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.649908 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztct8\" (UniqueName: \"kubernetes.io/projected/6250eecc-a19e-47d9-bc61-81bd620373fc-kube-api-access-ztct8\") pod \"crc-debug-qdhl7\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.649957 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6250eecc-a19e-47d9-bc61-81bd620373fc-host\") pod \"crc-debug-qdhl7\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.668186 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztct8\" (UniqueName: \"kubernetes.io/projected/6250eecc-a19e-47d9-bc61-81bd620373fc-kube-api-access-ztct8\") pod \"crc-debug-qdhl7\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.760160 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:34 crc kubenswrapper[4921]: I0312 15:36:34.983211 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:36:34 crc kubenswrapper[4921]: E0312 15:36:34.983674 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:36:35 crc kubenswrapper[4921]: I0312 15:36:35.769216 4921 generic.go:334] "Generic (PLEG): container finished" podID="6250eecc-a19e-47d9-bc61-81bd620373fc" containerID="b4d595030d28deb3a6a06d713e8ba894e45775bcdece64317704a82a560c8816" exitCode=0 Mar 12 15:36:35 crc kubenswrapper[4921]: I0312 15:36:35.769262 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" 
event={"ID":"6250eecc-a19e-47d9-bc61-81bd620373fc","Type":"ContainerDied","Data":"b4d595030d28deb3a6a06d713e8ba894e45775bcdece64317704a82a560c8816"} Mar 12 15:36:35 crc kubenswrapper[4921]: I0312 15:36:35.769287 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" event={"ID":"6250eecc-a19e-47d9-bc61-81bd620373fc","Type":"ContainerStarted","Data":"6c74b4d138796313f9f640026e7e5fbf57ed1b9968ca61ad7208d8bc1309c742"} Mar 12 15:36:35 crc kubenswrapper[4921]: I0312 15:36:35.843213 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-qdhl7"] Mar 12 15:36:35 crc kubenswrapper[4921]: I0312 15:36:35.860752 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2x9vp/crc-debug-qdhl7"] Mar 12 15:36:36 crc kubenswrapper[4921]: I0312 15:36:36.880247 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:36 crc kubenswrapper[4921]: I0312 15:36:36.998738 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztct8\" (UniqueName: \"kubernetes.io/projected/6250eecc-a19e-47d9-bc61-81bd620373fc-kube-api-access-ztct8\") pod \"6250eecc-a19e-47d9-bc61-81bd620373fc\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " Mar 12 15:36:36 crc kubenswrapper[4921]: I0312 15:36:36.998889 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6250eecc-a19e-47d9-bc61-81bd620373fc-host\") pod \"6250eecc-a19e-47d9-bc61-81bd620373fc\" (UID: \"6250eecc-a19e-47d9-bc61-81bd620373fc\") " Mar 12 15:36:36 crc kubenswrapper[4921]: I0312 15:36:36.999150 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6250eecc-a19e-47d9-bc61-81bd620373fc-host" (OuterVolumeSpecName: "host") pod "6250eecc-a19e-47d9-bc61-81bd620373fc" (UID: 
"6250eecc-a19e-47d9-bc61-81bd620373fc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 15:36:36 crc kubenswrapper[4921]: I0312 15:36:36.999372 4921 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6250eecc-a19e-47d9-bc61-81bd620373fc-host\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:37 crc kubenswrapper[4921]: I0312 15:36:37.003925 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6250eecc-a19e-47d9-bc61-81bd620373fc-kube-api-access-ztct8" (OuterVolumeSpecName: "kube-api-access-ztct8") pod "6250eecc-a19e-47d9-bc61-81bd620373fc" (UID: "6250eecc-a19e-47d9-bc61-81bd620373fc"). InnerVolumeSpecName "kube-api-access-ztct8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:36:37 crc kubenswrapper[4921]: I0312 15:36:37.102104 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztct8\" (UniqueName: \"kubernetes.io/projected/6250eecc-a19e-47d9-bc61-81bd620373fc-kube-api-access-ztct8\") on node \"crc\" DevicePath \"\"" Mar 12 15:36:37 crc kubenswrapper[4921]: I0312 15:36:37.789766 4921 scope.go:117] "RemoveContainer" containerID="b4d595030d28deb3a6a06d713e8ba894e45775bcdece64317704a82a560c8816" Mar 12 15:36:37 crc kubenswrapper[4921]: I0312 15:36:37.789838 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2x9vp/crc-debug-qdhl7" Mar 12 15:36:37 crc kubenswrapper[4921]: I0312 15:36:37.995441 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6250eecc-a19e-47d9-bc61-81bd620373fc" path="/var/lib/kubelet/pods/6250eecc-a19e-47d9-bc61-81bd620373fc/volumes" Mar 12 15:36:45 crc kubenswrapper[4921]: I0312 15:36:45.983934 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:36:45 crc kubenswrapper[4921]: E0312 15:36:45.984686 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:36:52 crc kubenswrapper[4921]: I0312 15:36:52.494288 4921 scope.go:117] "RemoveContainer" containerID="38da3af9d2069f7cbd520d4b35ed77ce32b373721aed72037f0520e55f5fa6bc" Mar 12 15:36:57 crc kubenswrapper[4921]: I0312 15:36:57.993935 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:36:58 crc kubenswrapper[4921]: I0312 15:36:58.959692 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"e71e67d67f2a250359c6ca0c79b048ff9319acfa4f7e5973d16d402b376ad816"} Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.577267 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lwtpk"] Mar 12 15:37:14 crc kubenswrapper[4921]: E0312 15:37:14.579329 4921 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6250eecc-a19e-47d9-bc61-81bd620373fc" containerName="container-00" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.579361 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="6250eecc-a19e-47d9-bc61-81bd620373fc" containerName="container-00" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.579586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="6250eecc-a19e-47d9-bc61-81bd620373fc" containerName="container-00" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.581344 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.599601 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lwtpk"] Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.681341 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-catalog-content\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.681538 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-utilities\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.681582 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rl2\" (UniqueName: \"kubernetes.io/projected/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-kube-api-access-k7rl2\") pod \"redhat-operators-lwtpk\" (UID: 
\"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.784298 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-catalog-content\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.784394 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-utilities\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.784449 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rl2\" (UniqueName: \"kubernetes.io/projected/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-kube-api-access-k7rl2\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.785371 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-catalog-content\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.785636 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-utilities\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " 
pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.810131 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rl2\" (UniqueName: \"kubernetes.io/projected/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-kube-api-access-k7rl2\") pod \"redhat-operators-lwtpk\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:14 crc kubenswrapper[4921]: I0312 15:37:14.902765 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:15 crc kubenswrapper[4921]: I0312 15:37:15.424861 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lwtpk"] Mar 12 15:37:16 crc kubenswrapper[4921]: I0312 15:37:16.119855 4921 generic.go:334] "Generic (PLEG): container finished" podID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerID="ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77" exitCode=0 Mar 12 15:37:16 crc kubenswrapper[4921]: I0312 15:37:16.120197 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerDied","Data":"ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77"} Mar 12 15:37:16 crc kubenswrapper[4921]: I0312 15:37:16.120258 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerStarted","Data":"e50b800dd0398c370f405cc5a99dd7f319ea6df9b713abd28d73ec310e22525f"} Mar 12 15:37:17 crc kubenswrapper[4921]: I0312 15:37:17.130112 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" 
event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerStarted","Data":"a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e"} Mar 12 15:37:18 crc kubenswrapper[4921]: I0312 15:37:18.139408 4921 generic.go:334] "Generic (PLEG): container finished" podID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerID="a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e" exitCode=0 Mar 12 15:37:18 crc kubenswrapper[4921]: I0312 15:37:18.139520 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerDied","Data":"a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e"} Mar 12 15:37:20 crc kubenswrapper[4921]: I0312 15:37:20.168164 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerStarted","Data":"23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f"} Mar 12 15:37:20 crc kubenswrapper[4921]: I0312 15:37:20.195053 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lwtpk" podStartSLOduration=3.346552246 podStartE2EDuration="6.195037571s" podCreationTimestamp="2026-03-12 15:37:14 +0000 UTC" firstStartedPulling="2026-03-12 15:37:16.122280344 +0000 UTC m=+8858.812352315" lastFinishedPulling="2026-03-12 15:37:18.970765669 +0000 UTC m=+8861.660837640" observedRunningTime="2026-03-12 15:37:20.186528717 +0000 UTC m=+8862.876600708" watchObservedRunningTime="2026-03-12 15:37:20.195037571 +0000 UTC m=+8862.885109532" Mar 12 15:37:24 crc kubenswrapper[4921]: I0312 15:37:24.902897 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:24 crc kubenswrapper[4921]: I0312 15:37:24.903449 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:25 crc kubenswrapper[4921]: I0312 15:37:25.962591 4921 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lwtpk" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="registry-server" probeResult="failure" output=< Mar 12 15:37:25 crc kubenswrapper[4921]: timeout: failed to connect service ":50051" within 1s Mar 12 15:37:25 crc kubenswrapper[4921]: > Mar 12 15:37:34 crc kubenswrapper[4921]: I0312 15:37:34.964773 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:35 crc kubenswrapper[4921]: I0312 15:37:35.143323 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:35 crc kubenswrapper[4921]: I0312 15:37:35.230931 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lwtpk"] Mar 12 15:37:36 crc kubenswrapper[4921]: I0312 15:37:36.305746 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lwtpk" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="registry-server" containerID="cri-o://23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f" gracePeriod=2 Mar 12 15:37:36 crc kubenswrapper[4921]: I0312 15:37:36.844579 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.030367 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-utilities\") pod \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.030422 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rl2\" (UniqueName: \"kubernetes.io/projected/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-kube-api-access-k7rl2\") pod \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.030498 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-catalog-content\") pod \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\" (UID: \"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a\") " Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.031338 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-utilities" (OuterVolumeSpecName: "utilities") pod "63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" (UID: "63a56ed0-8ecf-439c-b6e9-e1bcca9b684a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.039995 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-kube-api-access-k7rl2" (OuterVolumeSpecName: "kube-api-access-k7rl2") pod "63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" (UID: "63a56ed0-8ecf-439c-b6e9-e1bcca9b684a"). InnerVolumeSpecName "kube-api-access-k7rl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.133480 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.133517 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rl2\" (UniqueName: \"kubernetes.io/projected/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-kube-api-access-k7rl2\") on node \"crc\" DevicePath \"\"" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.179424 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" (UID: "63a56ed0-8ecf-439c-b6e9-e1bcca9b684a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.232199 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bcbd96998-bx4p5_59a6f440-5a89-42a7-baa1-77a875476665/barbican-api/0.log" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.235250 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.328175 4921 generic.go:334] "Generic (PLEG): container finished" podID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerID="23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f" exitCode=0 Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.328323 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" 
event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerDied","Data":"23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f"} Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.328493 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lwtpk" event={"ID":"63a56ed0-8ecf-439c-b6e9-e1bcca9b684a","Type":"ContainerDied","Data":"e50b800dd0398c370f405cc5a99dd7f319ea6df9b713abd28d73ec310e22525f"} Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.328512 4921 scope.go:117] "RemoveContainer" containerID="23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.328405 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lwtpk" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.384873 4921 scope.go:117] "RemoveContainer" containerID="a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.406470 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lwtpk"] Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.407480 4921 scope.go:117] "RemoveContainer" containerID="ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.421298 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lwtpk"] Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.457464 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-bcbd96998-bx4p5_59a6f440-5a89-42a7-baa1-77a875476665/barbican-api-log/0.log" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.467184 4921 scope.go:117] "RemoveContainer" containerID="23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f" Mar 12 15:37:37 crc kubenswrapper[4921]: E0312 
15:37:37.467965 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f\": container with ID starting with 23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f not found: ID does not exist" containerID="23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.468001 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f"} err="failed to get container status \"23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f\": rpc error: code = NotFound desc = could not find container \"23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f\": container with ID starting with 23b9ca05a5736155397f005416ad5e824bb58a0ec822745c75f6fed5ca55536f not found: ID does not exist" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.468032 4921 scope.go:117] "RemoveContainer" containerID="a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e" Mar 12 15:37:37 crc kubenswrapper[4921]: E0312 15:37:37.469861 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e\": container with ID starting with a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e not found: ID does not exist" containerID="a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.469893 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e"} err="failed to get container status \"a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e\": rpc 
error: code = NotFound desc = could not find container \"a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e\": container with ID starting with a5a67568f6563d378eef7cfdcaddb8f5ebeb3a42f37c38eb655d7296c9e2085e not found: ID does not exist" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.469953 4921 scope.go:117] "RemoveContainer" containerID="ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77" Mar 12 15:37:37 crc kubenswrapper[4921]: E0312 15:37:37.475943 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77\": container with ID starting with ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77 not found: ID does not exist" containerID="ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.476000 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77"} err="failed to get container status \"ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77\": rpc error: code = NotFound desc = could not find container \"ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77\": container with ID starting with ba7cff01e97e8269a4530e504fa036a96b6dcef1b41ae039bd7c222734ea5f77 not found: ID does not exist" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.491536 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b64f84d4-tpqnj_47867e82-3783-4f22-bc4f-9128016cf98e/barbican-keystone-listener/0.log" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.715885 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-594f99766c-xf6hh_6c4d7515-b40d-418c-b32e-b6a857c040a7/barbican-worker/0.log" Mar 12 15:37:37 crc 
kubenswrapper[4921]: I0312 15:37:37.830695 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-594f99766c-xf6hh_6c4d7515-b40d-418c-b32e-b6a857c040a7/barbican-worker-log/0.log" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.880515 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-76b64f84d4-tpqnj_47867e82-3783-4f22-bc4f-9128016cf98e/barbican-keystone-listener-log/0.log" Mar 12 15:37:37 crc kubenswrapper[4921]: I0312 15:37:37.995349 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" path="/var/lib/kubelet/pods/63a56ed0-8ecf-439c-b6e9-e1bcca9b684a/volumes" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.048679 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-4xhdf_e5130d9e-9678-42d8-9394-bcced05db054/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.156173 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/ceilometer-central-agent/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.239233 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/ceilometer-notification-agent/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.251556 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/proxy-httpd/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.291124 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f195685b-74f0-4887-8598-367bf4425faa/sg-core/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.465309 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-dt558_f5b6000a-13f1-4d52-9a03-3b777b3d651d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.553642 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bnlpk_cbaebc43-5127-4000-abb3-79a878177cd2/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.845185 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5b74f92-1f9b-4321-b549-47269e3eb04c/cinder-api-log/0.log" Mar 12 15:37:38 crc kubenswrapper[4921]: I0312 15:37:38.890405 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a5b74f92-1f9b-4321-b549-47269e3eb04c/cinder-api/0.log" Mar 12 15:37:39 crc kubenswrapper[4921]: I0312 15:37:39.248388 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-1_b1c64c98-e301-4386-b33e-ccd4fde7592d/cinder-api-log/0.log" Mar 12 15:37:39 crc kubenswrapper[4921]: I0312 15:37:39.254836 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-1_b1c64c98-e301-4386-b33e-ccd4fde7592d/cinder-api/0.log" Mar 12 15:37:39 crc kubenswrapper[4921]: I0312 15:37:39.513458 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0ca55d43-e73b-403b-9760-f71e8b926650/probe/0.log" Mar 12 15:37:39 crc kubenswrapper[4921]: I0312 15:37:39.862957 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7cda98bc-d6ac-4204-8477-8ecd7dafb976/cinder-scheduler/0.log" Mar 12 15:37:40 crc kubenswrapper[4921]: I0312 15:37:40.020174 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7cda98bc-d6ac-4204-8477-8ecd7dafb976/probe/0.log" Mar 12 15:37:40 crc kubenswrapper[4921]: I0312 15:37:40.366035 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8671593e-1709-4d99-ae81-8639ee492d20/probe/0.log" Mar 12 15:37:40 crc kubenswrapper[4921]: I0312 15:37:40.694833 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-6jxfw_0a18ea59-b5e6-40e3-8096-0f2bda4563bb/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:37:40 crc kubenswrapper[4921]: I0312 15:37:40.989317 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pcpck_5a0ab9f2-e0b6-40e1-9816-a11f8135ed75/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 12 15:37:41 crc kubenswrapper[4921]: I0312 15:37:41.289853 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/init/0.log" Mar 12 15:37:41 crc kubenswrapper[4921]: I0312 15:37:41.482716 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/init/0.log" Mar 12 15:37:41 crc kubenswrapper[4921]: I0312 15:37:41.796753 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ddcb284-70a7-47da-8b0e-e5ba1f0a9443/glance-httpd/0.log" Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.015893 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3ddcb284-70a7-47da-8b0e-e5ba1f0a9443/glance-log/0.log" Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.251019 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-1_5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b/glance-httpd/0.log" Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.292708 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-1_5f1d0fd6-231e-4fa8-8bd9-8331d1803d4b/glance-log/0.log"
Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.485085 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5664d5cbb7-9rpxn_5f732887-96f4-4cd5-9a36-df3848958280/dnsmasq-dns/0.log"
Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.592242 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d506b9f9-1563-432f-9b21-760ceb017fe9/glance-httpd/0.log"
Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.673986 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_d506b9f9-1563-432f-9b21-760ceb017fe9/glance-log/0.log"
Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.784976 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0ca55d43-e73b-403b-9760-f71e8b926650/cinder-backup/0.log"
Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.893849 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-1_739d7b6f-9f1d-4052-958f-e08821db9361/glance-log/0.log"
Mar 12 15:37:42 crc kubenswrapper[4921]: I0312 15:37:42.960924 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-1_739d7b6f-9f1d-4052-958f-e08821db9361/glance-httpd/0.log"
Mar 12 15:37:43 crc kubenswrapper[4921]: I0312 15:37:43.161407 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8671593e-1709-4d99-ae81-8639ee492d20/cinder-volume/0.log"
Mar 12 15:37:43 crc kubenswrapper[4921]: I0312 15:37:43.203878 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-bbd56cc76-cwl96_e6e62dec-8193-4d3c-a111-2ee250f79b86/horizon/0.log"
Mar 12 15:37:43 crc kubenswrapper[4921]: I0312 15:37:43.405466 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bgsdl_c4eac827-ab86-4fef-b974-8638416f5125/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:37:43 crc kubenswrapper[4921]: I0312 15:37:43.495292 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-hfsqf_56567424-34cd-49a4-ad03-c72a25a07058/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:37:44 crc kubenswrapper[4921]: I0312 15:37:44.171069 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555401-cfhz9_c85b992e-689f-4f2f-9799-da7e608f6ca8/keystone-cron/0.log"
Mar 12 15:37:44 crc kubenswrapper[4921]: I0312 15:37:44.180112 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-bbd56cc76-cwl96_e6e62dec-8193-4d3c-a111-2ee250f79b86/horizon-log/0.log"
Mar 12 15:37:44 crc kubenswrapper[4921]: I0312 15:37:44.420091 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29555461-nscpw_ce60198f-3189-4ce6-b4a7-32387eb98fa7/keystone-cron/0.log"
Mar 12 15:37:44 crc kubenswrapper[4921]: I0312 15:37:44.469335 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_01d94a77-b0dc-48b9-863b-71dbccd74bfb/kube-state-metrics/0.log"
Mar 12 15:37:44 crc kubenswrapper[4921]: I0312 15:37:44.748593 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kc6f6_2ee1e205-39b3-4648-8c21-4a7cd46b867f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:37:45 crc kubenswrapper[4921]: I0312 15:37:45.905750 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c8b44c5c7-l6d8m_8dfe0096-91f2-4f81-b7a9-a5ac9a3d0118/keystone-api/0.log"
Mar 12 15:37:46 crc kubenswrapper[4921]: I0312 15:37:46.007843 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-42nbh_f1a475b3-67ed-40db-b403-0f82930d5d36/neutron-httpd/0.log"
Mar 12 15:37:46 crc kubenswrapper[4921]: I0312 15:37:46.182123 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c8b44c5c7-pc46f_3fcdfac3-13b0-42ac-9396-587a7d443e2a/keystone-api/0.log"
Mar 12 15:37:46 crc kubenswrapper[4921]: I0312 15:37:46.669628 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvrp2_f5126789-42a1-4b3d-bc96-384b4db790b6/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:37:47 crc kubenswrapper[4921]: I0312 15:37:47.082006 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-9c58r_4d97370e-b2d5-463a-ba6d-5e8e12618140/neutron-httpd/0.log"
Mar 12 15:37:50 crc kubenswrapper[4921]: I0312 15:37:50.232516 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_148f1f44-e990-4353-b376-1ccbb7f01d0a/nova-api-log/0.log"
Mar 12 15:37:52 crc kubenswrapper[4921]: I0312 15:37:52.866991 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-9c58r_4d97370e-b2d5-463a-ba6d-5e8e12618140/neutron-api/0.log"
Mar 12 15:37:53 crc kubenswrapper[4921]: I0312 15:37:53.246631 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-547b7895d7-42nbh_f1a475b3-67ed-40db-b403-0f82930d5d36/neutron-api/0.log"
Mar 12 15:37:53 crc kubenswrapper[4921]: I0312 15:37:53.638064 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_148f1f44-e990-4353-b376-1ccbb7f01d0a/nova-api-api/0.log"
Mar 12 15:37:54 crc kubenswrapper[4921]: I0312 15:37:54.652693 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a7798e1f-b22a-4ebd-a812-e8c17694cf60/nova-cell1-conductor-conductor/0.log"
Mar 12 15:37:54 crc kubenswrapper[4921]: I0312 15:37:54.735376 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_072b6f7c-f4af-4657-82e6-ff8acb7404d5/nova-cell0-conductor-conductor/0.log"
Mar 12 15:37:55 crc kubenswrapper[4921]: I0312 15:37:55.256292 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6f997ce1-fc3d-4a1c-b9a8-d357e879f70d/nova-cell1-novncproxy-novncproxy/0.log"
Mar 12 15:37:55 crc kubenswrapper[4921]: I0312 15:37:55.461028 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-2c46j_bcef78dc-2d5d-4a04-b106-2b54e1b11292/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:37:55 crc kubenswrapper[4921]: I0312 15:37:55.805494 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a8089872-446f-4355-94d8-8b82e1b04030/nova-metadata-log/0.log"
Mar 12 15:37:56 crc kubenswrapper[4921]: I0312 15:37:56.550933 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-1_ae5ecb59-c6e0-4a5f-a034-059935a3eaff/nova-api-log/0.log"
Mar 12 15:37:56 crc kubenswrapper[4921]: I0312 15:37:56.976143 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/mysql-bootstrap/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.109118 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-1_ae5ecb59-c6e0-4a5f-a034-059935a3eaff/nova-api-api/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.215617 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/mysql-bootstrap/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.339072 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_69b5525a-14c6-453f-9673-11d9e63dd25a/galera/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.395759 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a8089872-446f-4355-94d8-8b82e1b04030/nova-metadata-metadata/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.556153 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/mysql-bootstrap/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.792827 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/galera/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.829031 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab9571cc-4c2d-4462-adc5-f84bd590bcca/mysql-bootstrap/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.869346 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b3862104-1cf4-4b79-ab48-f94ad1e83964/nova-scheduler-scheduler/0.log"
Mar 12 15:37:57 crc kubenswrapper[4921]: I0312 15:37:57.968598 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_345031e5-3e52-4b4e-ba3d-73bc5c3fe95d/openstackclient/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.066388 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zhfgt_0b5f8311-11bc-477e-b80a-ed2fa2ebc3bb/openstack-network-exporter/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.193658 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server-init/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.409462 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.422755 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovsdb-server-init/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.450025 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-z4nmg_f2c49e53-e8d4-4f9b-a05e-f44516144d43/ovs-vswitchd/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.609275 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s4mtb_6475132a-27dd-4c0b-bdd9-9c8b6fc8bbfb/ovn-controller/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.671204 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-p2wxb_8697c3cf-f4d2-45fb-9347-c580192e39d2/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.811201 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47b82052-6f75-4fe5-b4af-9726f2a59c2f/openstack-network-exporter/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.877047 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_47b82052-6f75-4fe5-b4af-9726f2a59c2f/ovn-northd/0.log"
Mar 12 15:37:58 crc kubenswrapper[4921]: I0312 15:37:58.949326 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ed0ceb5e-c541-4d3f-99b9-1865684ffa9d/openstack-network-exporter/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.065534 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ed0ceb5e-c541-4d3f-99b9-1865684ffa9d/ovsdbserver-nb/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.222218 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_cae9c939-db1a-4372-b8a0-ff4e9892cb85/openstack-network-exporter/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.251920 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_cae9c939-db1a-4372-b8a0-ff4e9892cb85/ovsdbserver-nb/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.380710 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_228e4171-a3c9-483e-bfa6-1e0cef68384c/openstack-network-exporter/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.422354 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_228e4171-a3c9-483e-bfa6-1e0cef68384c/ovsdbserver-sb/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.741163 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/setup-container/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.931952 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/setup-container/0.log"
Mar 12 15:37:59 crc kubenswrapper[4921]: I0312 15:37:59.955369 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b28ef2e5-d1ca-460a-9c97-a058c098ef64/rabbitmq/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.149725 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555498-564vr"]
Mar 12 15:38:00 crc kubenswrapper[4921]: E0312 15:38:00.150410 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="extract-utilities"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.150434 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="extract-utilities"
Mar 12 15:38:00 crc kubenswrapper[4921]: E0312 15:38:00.150448 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="registry-server"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.150457 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="registry-server"
Mar 12 15:38:00 crc kubenswrapper[4921]: E0312 15:38:00.150496 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="extract-content"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.150503 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="extract-content"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.150733 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="63a56ed0-8ecf-439c-b6e9-e1bcca9b684a" containerName="registry-server"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.151725 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.154965 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.155046 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.155122 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.182301 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/setup-container/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.187020 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-564vr"]
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.300327 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p66s5\" (UniqueName: \"kubernetes.io/projected/207d4e43-31a5-402e-a4d4-c9b33593f136-kube-api-access-p66s5\") pod \"auto-csr-approver-29555498-564vr\" (UID: \"207d4e43-31a5-402e-a4d4-c9b33593f136\") " pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.348482 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7ffb8f48-l6m2k_0091a555-ed5b-415c-ba49-7d2c64fdf54d/placement-api/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.402019 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p66s5\" (UniqueName: \"kubernetes.io/projected/207d4e43-31a5-402e-a4d4-c9b33593f136-kube-api-access-p66s5\") pod \"auto-csr-approver-29555498-564vr\" (UID: \"207d4e43-31a5-402e-a4d4-c9b33593f136\") " pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.423784 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p66s5\" (UniqueName: \"kubernetes.io/projected/207d4e43-31a5-402e-a4d4-c9b33593f136-kube-api-access-p66s5\") pod \"auto-csr-approver-29555498-564vr\" (UID: \"207d4e43-31a5-402e-a4d4-c9b33593f136\") " pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.471251 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.482801 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/setup-container/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.538905 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_7e627c0e-6753-4c4a-ad5f-7d36e4373a2c/rabbitmq/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.640226 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7f7ffb8f48-l6m2k_0091a555-ed5b-415c-ba49-7d2c64fdf54d/placement-log/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.731084 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gxv6z_55bbba5f-5f7a-44d9-8fb0-eb6d6cd69da2/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.851119 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f0c221da-6e02-450a-a048-9c8292c208ff/memcached/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.870896 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-8mlp5_66cfa5a2-1910-4504-84cb-24e75749c210/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.943544 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nzzfd_095fb2e2-a411-4c41-bf21-1c8b69166a54/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:38:00 crc kubenswrapper[4921]: I0312 15:38:00.994287 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-564vr"]
Mar 12 15:38:01 crc kubenswrapper[4921]: I0312 15:38:01.086604 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-7x2dm_7dc60d30-c59f-4cd4-b798-7e8214c0fa52/ssh-known-hosts-edpm-deployment/0.log"
Mar 12 15:38:01 crc kubenswrapper[4921]: I0312 15:38:01.169270 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b061c47e-9c37-48ed-a879-9263d780de9f/tempest-tests-tempest-tests-runner/0.log"
Mar 12 15:38:01 crc kubenswrapper[4921]: I0312 15:38:01.268002 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5d16b762-c737-4831-ae57-099f1da5d7fb/test-operator-logs-container/0.log"
Mar 12 15:38:01 crc kubenswrapper[4921]: I0312 15:38:01.313338 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-s4zpm_36211ec3-db4f-4485-a93d-08dd120af919/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 12 15:38:01 crc kubenswrapper[4921]: I0312 15:38:01.568134 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-564vr" event={"ID":"207d4e43-31a5-402e-a4d4-c9b33593f136","Type":"ContainerStarted","Data":"8e443fe2b13073a0ff222e55c19e93675ff2d6e3825b7b99ddb47cc43d440497"}
Mar 12 15:38:02 crc kubenswrapper[4921]: I0312 15:38:02.580066 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-564vr" event={"ID":"207d4e43-31a5-402e-a4d4-c9b33593f136","Type":"ContainerStarted","Data":"38b8c581eb85163b5bf0ba28b2f1bf9469a2d90b1d6aaf22aed772a03817b6ec"}
Mar 12 15:38:02 crc kubenswrapper[4921]: I0312 15:38:02.599295 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555498-564vr" podStartSLOduration=1.558784521 podStartE2EDuration="2.599275429s" podCreationTimestamp="2026-03-12 15:38:00 +0000 UTC" firstStartedPulling="2026-03-12 15:38:00.999286928 +0000 UTC m=+8903.689358899" lastFinishedPulling="2026-03-12 15:38:02.039777836 +0000 UTC m=+8904.729849807" observedRunningTime="2026-03-12 15:38:02.591931891 +0000 UTC m=+8905.282003862" watchObservedRunningTime="2026-03-12 15:38:02.599275429 +0000 UTC m=+8905.289347400"
Mar 12 15:38:03 crc kubenswrapper[4921]: I0312 15:38:03.590904 4921 generic.go:334] "Generic (PLEG): container finished" podID="207d4e43-31a5-402e-a4d4-c9b33593f136" containerID="38b8c581eb85163b5bf0ba28b2f1bf9469a2d90b1d6aaf22aed772a03817b6ec" exitCode=0
Mar 12 15:38:03 crc kubenswrapper[4921]: I0312 15:38:03.591008 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-564vr" event={"ID":"207d4e43-31a5-402e-a4d4-c9b33593f136","Type":"ContainerDied","Data":"38b8c581eb85163b5bf0ba28b2f1bf9469a2d90b1d6aaf22aed772a03817b6ec"}
Mar 12 15:38:04 crc kubenswrapper[4921]: I0312 15:38:04.947147 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.095639 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p66s5\" (UniqueName: \"kubernetes.io/projected/207d4e43-31a5-402e-a4d4-c9b33593f136-kube-api-access-p66s5\") pod \"207d4e43-31a5-402e-a4d4-c9b33593f136\" (UID: \"207d4e43-31a5-402e-a4d4-c9b33593f136\") "
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.104424 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/207d4e43-31a5-402e-a4d4-c9b33593f136-kube-api-access-p66s5" (OuterVolumeSpecName: "kube-api-access-p66s5") pod "207d4e43-31a5-402e-a4d4-c9b33593f136" (UID: "207d4e43-31a5-402e-a4d4-c9b33593f136"). InnerVolumeSpecName "kube-api-access-p66s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.198623 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p66s5\" (UniqueName: \"kubernetes.io/projected/207d4e43-31a5-402e-a4d4-c9b33593f136-kube-api-access-p66s5\") on node \"crc\" DevicePath \"\""
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.609140 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555498-564vr"
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.609169 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555498-564vr" event={"ID":"207d4e43-31a5-402e-a4d4-c9b33593f136","Type":"ContainerDied","Data":"8e443fe2b13073a0ff222e55c19e93675ff2d6e3825b7b99ddb47cc43d440497"}
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.609199 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e443fe2b13073a0ff222e55c19e93675ff2d6e3825b7b99ddb47cc43d440497"
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.686846 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-j9ww9"]
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.696021 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555492-j9ww9"]
Mar 12 15:38:05 crc kubenswrapper[4921]: I0312 15:38:05.997101 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a109c9d1-84c7-46ed-8631-37b6d309a388" path="/var/lib/kubelet/pods/a109c9d1-84c7-46ed-8631-37b6d309a388/volumes"
Mar 12 15:38:22 crc kubenswrapper[4921]: I0312 15:38:22.949770 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/util/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.104771 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/util/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.167156 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/pull/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.380202 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/pull/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.494413 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/util/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.534322 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/pull/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.561714 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4d52f25c614d14ea9d555eaa2e62114c0c7d01977d30b495569716fcders97t_ee35f8dc-1fbf-4466-86c0-17d859d09951/extract/0.log"
Mar 12 15:38:23 crc kubenswrapper[4921]: I0312 15:38:23.952711 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-j46tf_5908e8b2-d088-4190-8ccf-ea7526921e80/manager/0.log"
Mar 12 15:38:24 crc kubenswrapper[4921]: I0312 15:38:24.370658 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-5jt7c_7494cb10-090c-4ac2-bbf1-663979f3e4cf/manager/0.log"
Mar 12 15:38:24 crc kubenswrapper[4921]: I0312 15:38:24.501009 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-nq8wj_c6de3785-ea06-49bb-9b39-d8f2f10bce81/manager/0.log"
Mar 12 15:38:24 crc kubenswrapper[4921]: I0312 15:38:24.905854 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-fp4rs_001425f5-0a2a-4bdc-a437-d6f9ba3687b4/manager/0.log"
Mar 12 15:38:25 crc kubenswrapper[4921]: I0312 15:38:25.415219 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-67xqg_6a1a1aea-a74a-4886-ae24-1d188243e859/manager/0.log"
Mar 12 15:38:25 crc kubenswrapper[4921]: I0312 15:38:25.640671 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-9tkrv_c09491c8-72c5-4019-91bf-37ee1a3a937c/manager/0.log"
Mar 12 15:38:25 crc kubenswrapper[4921]: I0312 15:38:25.966384 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-v42m2_d4de9b0c-3812-462a-aa80-ffe00e6d47ca/manager/0.log"
Mar 12 15:38:26 crc kubenswrapper[4921]: I0312 15:38:26.180004 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-xzm8h_fd1bc9ca-529d-4d59-a236-db1bb5c121ca/manager/0.log"
Mar 12 15:38:26 crc kubenswrapper[4921]: I0312 15:38:26.444549 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-692s5_6131e4c9-d85a-4cdf-9cec-128c9e81bc29/manager/0.log"
Mar 12 15:38:26 crc kubenswrapper[4921]: I0312 15:38:26.713901 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-kzh67_2394f3bd-4f8b-4036-b240-7ed71b80798a/manager/0.log"
Mar 12 15:38:27 crc kubenswrapper[4921]: I0312 15:38:27.022850 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-686d5f9fbd-hmkmx_1a0b0ff9-21c3-452f-9ded-00d374fbbcbe/manager/0.log"
Mar 12 15:38:27 crc kubenswrapper[4921]: I0312 15:38:27.059234 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-bz8j7_4e1ee178-3f0e-405a-93cb-9414b2fccbe0/manager/0.log"
Mar 12 15:38:27 crc kubenswrapper[4921]: I0312 15:38:27.338234 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-zmq56_ac8d4a43-01b6-438e-b1d8-d3521ed82176/manager/0.log"
Mar 12 15:38:27 crc kubenswrapper[4921]: I0312 15:38:27.472274 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7sq7h8_0c9cd39f-8440-4f22-82ce-d3be95bea1be/manager/0.log"
Mar 12 15:38:27 crc kubenswrapper[4921]: I0312 15:38:27.658198 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5bc4df7446-bp8nq_c7db0c3c-40e2-49df-bffc-c0f94b26c92f/operator/0.log"
Mar 12 15:38:27 crc kubenswrapper[4921]: I0312 15:38:27.821097 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-rrhpc_5f20d433-83bd-4524-a6ce-ef19ef8a1064/registry-server/0.log"
Mar 12 15:38:28 crc kubenswrapper[4921]: I0312 15:38:28.116033 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-x4tf4_994c3a47-47a7-4fbe-9f9c-df011597775b/manager/0.log"
Mar 12 15:38:28 crc kubenswrapper[4921]: I0312 15:38:28.158471 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-64dcj_3a930c0b-6c3b-4a1d-b02f-1190a124ceb2/manager/0.log"
Mar 12 15:38:28 crc kubenswrapper[4921]: I0312 15:38:28.409038 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-h97zm_f0da206d-658e-47e1-9cfb-5b74237c406a/operator/0.log"
Mar 12 15:38:28 crc kubenswrapper[4921]: I0312 15:38:28.520234 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-m842c_f2c81917-4047-4d0b-baed-45afa8a53a60/manager/0.log"
Mar 12 15:38:28 crc kubenswrapper[4921]: I0312 15:38:28.813784 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-dlgkj_fe35cc9d-bfc6-4a4d-b21f-06ab55672726/manager/0.log"
Mar 12 15:38:28 crc kubenswrapper[4921]: I0312 15:38:28.856034 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-2sf7v_ca8b207a-2cf1-455c-b7b4-0f7e2ec5a91b/manager/0.log"
Mar 12 15:38:29 crc kubenswrapper[4921]: I0312 15:38:29.111196 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-7l7sm_2db21a73-26d9-44d6-aa91-ba8068b0525a/manager/0.log"
Mar 12 15:38:29 crc kubenswrapper[4921]: I0312 15:38:29.572957 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5785b7957-24wxp_9b888138-4648-48a6-9364-639fb0e0c8b6/manager/0.log"
Mar 12 15:38:38 crc kubenswrapper[4921]: I0312 15:38:38.779902 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-dmwhv_0cc6c5ac-1bcd-4636-924a-8a6d6ebfaeea/manager/0.log"
Mar 12 15:38:49 crc kubenswrapper[4921]: I0312 15:38:49.734251 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-x8rdl_5cb8dae4-ba53-4d26-9cdd-9099acd8ebd4/control-plane-machine-set-operator/0.log"
Mar 12 15:38:49 crc kubenswrapper[4921]: I0312 15:38:49.912975 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r7sfx_345c99f7-75d2-48da-9a45-6fd8ce5c92da/kube-rbac-proxy/0.log"
Mar 12 15:38:49 crc kubenswrapper[4921]: I0312 15:38:49.974624 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-r7sfx_345c99f7-75d2-48da-9a45-6fd8ce5c92da/machine-api-operator/0.log"
Mar 12 15:38:52 crc kubenswrapper[4921]: I0312 15:38:52.648610 4921 scope.go:117] "RemoveContainer" containerID="169778feb9448b75c38a3428bbfda9502b2b3a16781990d8545c0b97334b970d"
Mar 12 15:39:01 crc kubenswrapper[4921]: I0312 15:39:01.295134 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d22fr_aabe30ef-92c9-4d25-8278-09d1dba1583b/cert-manager-controller/0.log"
Mar 12 15:39:01 crc kubenswrapper[4921]: I0312 15:39:01.525794 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-jw9bb_5e022cd5-783e-4dbe-a554-42a43e2bc746/cert-manager-webhook/0.log"
Mar 12 15:39:01 crc kubenswrapper[4921]: I0312 15:39:01.530585 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-bbpqn_b02a546a-2d4e-4de2-9673-9c7b2d37a6e8/cert-manager-cainjector/0.log"
Mar 12 15:39:12 crc kubenswrapper[4921]: I0312 15:39:12.820482 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-ppq69_e1bd23bf-3c09-41ff-9840-3397219f3f4d/nmstate-console-plugin/0.log"
Mar 12 15:39:13 crc kubenswrapper[4921]: I0312 15:39:13.027278 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-2x8kb_7fea2e61-eacd-4cef-9425-2e03106cf6f4/nmstate-handler/0.log"
Mar 12 15:39:13 crc kubenswrapper[4921]: I0312 15:39:13.051454 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tkdph_e3a3372c-64ea-4841-91b6-55d6dbc9490a/kube-rbac-proxy/0.log"
Mar 12 15:39:13 crc kubenswrapper[4921]: I0312 15:39:13.152668 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-tkdph_e3a3372c-64ea-4841-91b6-55d6dbc9490a/nmstate-metrics/0.log"
Mar 12 15:39:13 crc kubenswrapper[4921]: I0312 15:39:13.265447 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-nvv9l_7258907e-4b4e-41d5-aac1-9d0fb967e5fd/nmstate-operator/0.log"
Mar 12 15:39:13 crc kubenswrapper[4921]: I0312 15:39:13.403176 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-kf975_20f1f547-f958-419e-a5c2-58695625d6ad/nmstate-webhook/0.log"
Mar 12 15:39:26 crc kubenswrapper[4921]: I0312 15:39:26.324103 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 12 15:39:26 crc kubenswrapper[4921]: I0312 15:39:26.324695 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 12 15:39:37 crc kubenswrapper[4921]: I0312 15:39:37.852412 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-nzvhg_ceb498e3-36d0-4f72-9c07-54807b7a11ea/kube-rbac-proxy/0.log"
Mar 12 15:39:37 crc kubenswrapper[4921]: I0312 15:39:37.926629 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-nzvhg_ceb498e3-36d0-4f72-9c07-54807b7a11ea/controller/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.084730 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.262559 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.262636 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.280449 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.288837 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.474312 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.475031 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.479164 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.485323 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.684134 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-frr-files/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.720927 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-metrics/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.722716 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/controller/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.729570 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/cp-reloader/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.904982 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/frr-metrics/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.955788 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/kube-rbac-proxy/0.log"
Mar 12 15:39:38 crc kubenswrapper[4921]: I0312 15:39:38.978316 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/kube-rbac-proxy-frr/0.log"
Mar 12 15:39:39 crc kubenswrapper[4921]: I0312 15:39:39.122681 4921 log.go:25] "Finished parsing log file"
path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/reloader/0.log" Mar 12 15:39:39 crc kubenswrapper[4921]: I0312 15:39:39.183638 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jn8d5_aabfc338-f7a1-46a8-a02a-daf1adc64862/frr-k8s-webhook-server/0.log" Mar 12 15:39:39 crc kubenswrapper[4921]: I0312 15:39:39.467127 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74b4d54bf-8p27k_ccbab5b1-d08b-4c2d-9ac9-f265e0bf8234/manager/0.log" Mar 12 15:39:39 crc kubenswrapper[4921]: I0312 15:39:39.601210 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78c99c5f4b-pq84h_7a20ce4c-4e95-4fcd-ba22-212cc219c81f/webhook-server/0.log" Mar 12 15:39:39 crc kubenswrapper[4921]: I0312 15:39:39.753589 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfh6j_8ae92198-0eeb-414f-859a-27c54e4338bf/kube-rbac-proxy/0.log" Mar 12 15:39:40 crc kubenswrapper[4921]: I0312 15:39:40.431101 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zfh6j_8ae92198-0eeb-414f-859a-27c54e4338bf/speaker/0.log" Mar 12 15:39:41 crc kubenswrapper[4921]: I0312 15:39:41.477783 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-qcglj_2ebf7941-9d40-49cf-ad40-530b5e696770/frr/0.log" Mar 12 15:39:51 crc kubenswrapper[4921]: I0312 15:39:51.916647 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/util/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.148566 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/util/0.log" 
Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.172007 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/pull/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.177980 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/pull/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.318777 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/util/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.331128 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/pull/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.341863 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l4mtk_6cbcab60-00bb-4477-a36c-5d3f8298ab6b/extract/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.499153 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/util/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.656754 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/util/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.659174 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/pull/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.696752 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/pull/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.839417 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/util/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.839480 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/pull/0.log" Mar 12 15:39:52 crc kubenswrapper[4921]: I0312 15:39:52.845023 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c18rqt7_8247093d-09e8-4ff9-8a21-902c3135b7ab/extract/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.009648 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-utilities/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.190536 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-content/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.194665 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-content/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.212831 4921 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-utilities/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.374980 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-utilities/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.376847 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/extract-content/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.595086 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-utilities/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.840019 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-content/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.856554 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-content/0.log" Mar 12 15:39:53 crc kubenswrapper[4921]: I0312 15:39:53.876897 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-utilities/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.044531 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-utilities/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.085036 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/extract-content/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.347257 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-cc774_f8eea941-027c-44f8-a189-b7e9b3c6cb55/marketplace-operator/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.519238 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-utilities/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.529688 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5622m_e8698537-b9bf-41de-9d12-68d07948c6e4/registry-server/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.809945 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-utilities/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.811790 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-content/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.866094 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6gjth_01d61927-e67d-49cf-97e5-70d2fed9192b/registry-server/0.log" Mar 12 15:39:54 crc kubenswrapper[4921]: I0312 15:39:54.881958 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-content/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.038941 4921 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-content/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.046153 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/extract-utilities/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.251070 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-utilities/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.382209 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-jnh65_587b8721-fb47-4cd2-8c47-917e0b6dd5dc/registry-server/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.454080 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-utilities/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.476653 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-content/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.491122 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-content/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.687966 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-utilities/0.log" Mar 12 15:39:55 crc kubenswrapper[4921]: I0312 15:39:55.696453 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/extract-content/0.log" Mar 
12 15:39:56 crc kubenswrapper[4921]: I0312 15:39:56.125334 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bstd5_96baa3f9-7cf9-499b-94ad-0f8cd1a98f76/registry-server/0.log" Mar 12 15:39:56 crc kubenswrapper[4921]: I0312 15:39:56.323406 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:39:56 crc kubenswrapper[4921]: I0312 15:39:56.323463 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.140536 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555500-w56j6"] Mar 12 15:40:00 crc kubenswrapper[4921]: E0312 15:40:00.141433 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="207d4e43-31a5-402e-a4d4-c9b33593f136" containerName="oc" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.141447 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="207d4e43-31a5-402e-a4d4-c9b33593f136" containerName="oc" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.141642 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="207d4e43-31a5-402e-a4d4-c9b33593f136" containerName="oc" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.142306 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.144167 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.144267 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.146201 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.154056 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-w56j6"] Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.235884 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqvq\" (UniqueName: \"kubernetes.io/projected/031973f1-465e-445a-b19e-c187e6fe1edb-kube-api-access-gtqvq\") pod \"auto-csr-approver-29555500-w56j6\" (UID: \"031973f1-465e-445a-b19e-c187e6fe1edb\") " pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.338475 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtqvq\" (UniqueName: \"kubernetes.io/projected/031973f1-465e-445a-b19e-c187e6fe1edb-kube-api-access-gtqvq\") pod \"auto-csr-approver-29555500-w56j6\" (UID: \"031973f1-465e-445a-b19e-c187e6fe1edb\") " pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.370601 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtqvq\" (UniqueName: \"kubernetes.io/projected/031973f1-465e-445a-b19e-c187e6fe1edb-kube-api-access-gtqvq\") pod \"auto-csr-approver-29555500-w56j6\" (UID: \"031973f1-465e-445a-b19e-c187e6fe1edb\") " 
pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.462048 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.964548 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-w56j6"] Mar 12 15:40:00 crc kubenswrapper[4921]: W0312 15:40:00.971482 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod031973f1_465e_445a_b19e_c187e6fe1edb.slice/crio-30d4338c17241a0405eb52cb0ed2cee78e0a488f811479f33c6005109b8076e0 WatchSource:0}: Error finding container 30d4338c17241a0405eb52cb0ed2cee78e0a488f811479f33c6005109b8076e0: Status 404 returned error can't find the container with id 30d4338c17241a0405eb52cb0ed2cee78e0a488f811479f33c6005109b8076e0 Mar 12 15:40:00 crc kubenswrapper[4921]: I0312 15:40:00.975686 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 15:40:01 crc kubenswrapper[4921]: I0312 15:40:01.646236 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-w56j6" event={"ID":"031973f1-465e-445a-b19e-c187e6fe1edb","Type":"ContainerStarted","Data":"30d4338c17241a0405eb52cb0ed2cee78e0a488f811479f33c6005109b8076e0"} Mar 12 15:40:02 crc kubenswrapper[4921]: I0312 15:40:02.655367 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-w56j6" event={"ID":"031973f1-465e-445a-b19e-c187e6fe1edb","Type":"ContainerStarted","Data":"40f93a58e8e3186ee135dbfc6c811b164e2aebd1767665b930cd4632d13bf638"} Mar 12 15:40:02 crc kubenswrapper[4921]: I0312 15:40:02.678316 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555500-w56j6" 
podStartSLOduration=1.466444952 podStartE2EDuration="2.67829788s" podCreationTimestamp="2026-03-12 15:40:00 +0000 UTC" firstStartedPulling="2026-03-12 15:40:00.975344633 +0000 UTC m=+9023.665416614" lastFinishedPulling="2026-03-12 15:40:02.187197571 +0000 UTC m=+9024.877269542" observedRunningTime="2026-03-12 15:40:02.678062093 +0000 UTC m=+9025.368134074" watchObservedRunningTime="2026-03-12 15:40:02.67829788 +0000 UTC m=+9025.368369851" Mar 12 15:40:03 crc kubenswrapper[4921]: I0312 15:40:03.665762 4921 generic.go:334] "Generic (PLEG): container finished" podID="031973f1-465e-445a-b19e-c187e6fe1edb" containerID="40f93a58e8e3186ee135dbfc6c811b164e2aebd1767665b930cd4632d13bf638" exitCode=0 Mar 12 15:40:03 crc kubenswrapper[4921]: I0312 15:40:03.665821 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-w56j6" event={"ID":"031973f1-465e-445a-b19e-c187e6fe1edb","Type":"ContainerDied","Data":"40f93a58e8e3186ee135dbfc6c811b164e2aebd1767665b930cd4632d13bf638"} Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.031452 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.146036 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtqvq\" (UniqueName: \"kubernetes.io/projected/031973f1-465e-445a-b19e-c187e6fe1edb-kube-api-access-gtqvq\") pod \"031973f1-465e-445a-b19e-c187e6fe1edb\" (UID: \"031973f1-465e-445a-b19e-c187e6fe1edb\") " Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.151683 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/031973f1-465e-445a-b19e-c187e6fe1edb-kube-api-access-gtqvq" (OuterVolumeSpecName: "kube-api-access-gtqvq") pod "031973f1-465e-445a-b19e-c187e6fe1edb" (UID: "031973f1-465e-445a-b19e-c187e6fe1edb"). InnerVolumeSpecName "kube-api-access-gtqvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.248658 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtqvq\" (UniqueName: \"kubernetes.io/projected/031973f1-465e-445a-b19e-c187e6fe1edb-kube-api-access-gtqvq\") on node \"crc\" DevicePath \"\"" Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.683661 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555500-w56j6" event={"ID":"031973f1-465e-445a-b19e-c187e6fe1edb","Type":"ContainerDied","Data":"30d4338c17241a0405eb52cb0ed2cee78e0a488f811479f33c6005109b8076e0"} Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.684001 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30d4338c17241a0405eb52cb0ed2cee78e0a488f811479f33c6005109b8076e0" Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.683722 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555500-w56j6" Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.737774 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-ggpnd"] Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.748652 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555494-ggpnd"] Mar 12 15:40:05 crc kubenswrapper[4921]: I0312 15:40:05.993097 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd441807-f8ad-47d7-8a7a-b4f01fbc7e71" path="/var/lib/kubelet/pods/dd441807-f8ad-47d7-8a7a-b4f01fbc7e71/volumes" Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.324814 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.325615 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.325673 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.326605 4921 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e71e67d67f2a250359c6ca0c79b048ff9319acfa4f7e5973d16d402b376ad816"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.326656 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://e71e67d67f2a250359c6ca0c79b048ff9319acfa4f7e5973d16d402b376ad816" gracePeriod=600 Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.873529 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="e71e67d67f2a250359c6ca0c79b048ff9319acfa4f7e5973d16d402b376ad816" exitCode=0 Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.873620 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" 
event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"e71e67d67f2a250359c6ca0c79b048ff9319acfa4f7e5973d16d402b376ad816"} Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.874303 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerStarted","Data":"53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"} Mar 12 15:40:26 crc kubenswrapper[4921]: I0312 15:40:26.874385 4921 scope.go:117] "RemoveContainer" containerID="92146b92b91c55940864f049a25b20a508f4929154c8bacc46b1da6c23f14d54" Mar 12 15:40:52 crc kubenswrapper[4921]: I0312 15:40:52.767401 4921 scope.go:117] "RemoveContainer" containerID="1e1fbd17d69073dd1b95749c8d63aa82f733d3d56d0e336729665947262dd208" Mar 12 15:40:52 crc kubenswrapper[4921]: I0312 15:40:52.793403 4921 scope.go:117] "RemoveContainer" containerID="9c871a4be64049f2fa53adda52b7bb9ae3dc33b39b962c4de92f90531338a273" Mar 12 15:40:52 crc kubenswrapper[4921]: I0312 15:40:52.857962 4921 scope.go:117] "RemoveContainer" containerID="7bfe957de2b8815a8bf650d20dbdfdf3eb07098ae657c27d61236cef118fe6b8" Mar 12 15:40:52 crc kubenswrapper[4921]: I0312 15:40:52.877928 4921 scope.go:117] "RemoveContainer" containerID="dfce88e1982cbc7a0d76d4e8be83401431411d477402072842a4ec5e0909bd9a" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.648632 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rxl9r"] Mar 12 15:41:52 crc kubenswrapper[4921]: E0312 15:41:52.649903 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="031973f1-465e-445a-b19e-c187e6fe1edb" containerName="oc" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.649922 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="031973f1-465e-445a-b19e-c187e6fe1edb" containerName="oc" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.650167 4921 
memory_manager.go:354] "RemoveStaleState removing state" podUID="031973f1-465e-445a-b19e-c187e6fe1edb" containerName="oc" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.651925 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.662676 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxl9r"] Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.757712 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-catalog-content\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.757908 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-utilities\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.757962 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lr9\" (UniqueName: \"kubernetes.io/projected/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-kube-api-access-h7lr9\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.859984 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-utilities\") pod 
\"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.860536 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-utilities\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.860832 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lr9\" (UniqueName: \"kubernetes.io/projected/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-kube-api-access-h7lr9\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.860927 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-catalog-content\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.861444 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-catalog-content\") pod \"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.898729 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lr9\" (UniqueName: \"kubernetes.io/projected/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-kube-api-access-h7lr9\") pod 
\"community-operators-rxl9r\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:52 crc kubenswrapper[4921]: I0312 15:41:52.987894 4921 scope.go:117] "RemoveContainer" containerID="a5cd69f583528e84702952573f454c3d155e4e4f55ef079121cb44b5443da541" Mar 12 15:41:53 crc kubenswrapper[4921]: I0312 15:41:53.011089 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:41:53 crc kubenswrapper[4921]: I0312 15:41:53.636206 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rxl9r"] Mar 12 15:41:54 crc kubenswrapper[4921]: I0312 15:41:54.587409 4921 generic.go:334] "Generic (PLEG): container finished" podID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerID="b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166" exitCode=0 Mar 12 15:41:54 crc kubenswrapper[4921]: I0312 15:41:54.587466 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerDied","Data":"b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166"} Mar 12 15:41:54 crc kubenswrapper[4921]: I0312 15:41:54.587878 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerStarted","Data":"04ba54fab1fd1199f795cc03b5939c0c55e19fee5ccf277e762f88aaf8e4ecbe"} Mar 12 15:41:55 crc kubenswrapper[4921]: I0312 15:41:55.625703 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerStarted","Data":"63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582"} Mar 12 15:41:56 crc kubenswrapper[4921]: I0312 15:41:56.634164 4921 generic.go:334] "Generic 
(PLEG): container finished" podID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerID="63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582" exitCode=0 Mar 12 15:41:56 crc kubenswrapper[4921]: I0312 15:41:56.634209 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerDied","Data":"63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582"} Mar 12 15:41:57 crc kubenswrapper[4921]: I0312 15:41:57.644477 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerStarted","Data":"ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5"} Mar 12 15:41:57 crc kubenswrapper[4921]: I0312 15:41:57.673685 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rxl9r" podStartSLOduration=3.133413549 podStartE2EDuration="5.673661559s" podCreationTimestamp="2026-03-12 15:41:52 +0000 UTC" firstStartedPulling="2026-03-12 15:41:54.590402418 +0000 UTC m=+9137.280474389" lastFinishedPulling="2026-03-12 15:41:57.130650418 +0000 UTC m=+9139.820722399" observedRunningTime="2026-03-12 15:41:57.666749085 +0000 UTC m=+9140.356821076" watchObservedRunningTime="2026-03-12 15:41:57.673661559 +0000 UTC m=+9140.363733530" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.176747 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555502-jm2kj"] Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.178475 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.180771 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.181046 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.182603 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.200795 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555502-jm2kj"] Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.328712 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s5ld\" (UniqueName: \"kubernetes.io/projected/5111e3fb-dab3-4a7d-a030-cfe517378db3-kube-api-access-5s5ld\") pod \"auto-csr-approver-29555502-jm2kj\" (UID: \"5111e3fb-dab3-4a7d-a030-cfe517378db3\") " pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.430802 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s5ld\" (UniqueName: \"kubernetes.io/projected/5111e3fb-dab3-4a7d-a030-cfe517378db3-kube-api-access-5s5ld\") pod \"auto-csr-approver-29555502-jm2kj\" (UID: \"5111e3fb-dab3-4a7d-a030-cfe517378db3\") " pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.450611 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s5ld\" (UniqueName: \"kubernetes.io/projected/5111e3fb-dab3-4a7d-a030-cfe517378db3-kube-api-access-5s5ld\") pod \"auto-csr-approver-29555502-jm2kj\" (UID: \"5111e3fb-dab3-4a7d-a030-cfe517378db3\") " 
pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.501334 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:00 crc kubenswrapper[4921]: I0312 15:42:00.977505 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555502-jm2kj"] Mar 12 15:42:01 crc kubenswrapper[4921]: I0312 15:42:01.688095 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" event={"ID":"5111e3fb-dab3-4a7d-a030-cfe517378db3","Type":"ContainerStarted","Data":"81ee4a265ab02b4ce02f1d4c5493d8325428ea88dd6b105844887b3f498cc09a"} Mar 12 15:42:02 crc kubenswrapper[4921]: I0312 15:42:02.697797 4921 generic.go:334] "Generic (PLEG): container finished" podID="5111e3fb-dab3-4a7d-a030-cfe517378db3" containerID="0c0063a103e85042962fafa27032e96a55a2f726d85ab02129fbb5cd9176098b" exitCode=0 Mar 12 15:42:02 crc kubenswrapper[4921]: I0312 15:42:02.697853 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" event={"ID":"5111e3fb-dab3-4a7d-a030-cfe517378db3","Type":"ContainerDied","Data":"0c0063a103e85042962fafa27032e96a55a2f726d85ab02129fbb5cd9176098b"} Mar 12 15:42:03 crc kubenswrapper[4921]: I0312 15:42:03.011972 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:42:03 crc kubenswrapper[4921]: I0312 15:42:03.012299 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:42:03 crc kubenswrapper[4921]: I0312 15:42:03.061078 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:42:03 crc kubenswrapper[4921]: I0312 15:42:03.750289 4921 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:42:03 crc kubenswrapper[4921]: I0312 15:42:03.812139 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxl9r"] Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.088683 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.203874 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s5ld\" (UniqueName: \"kubernetes.io/projected/5111e3fb-dab3-4a7d-a030-cfe517378db3-kube-api-access-5s5ld\") pod \"5111e3fb-dab3-4a7d-a030-cfe517378db3\" (UID: \"5111e3fb-dab3-4a7d-a030-cfe517378db3\") " Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.212701 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5111e3fb-dab3-4a7d-a030-cfe517378db3-kube-api-access-5s5ld" (OuterVolumeSpecName: "kube-api-access-5s5ld") pod "5111e3fb-dab3-4a7d-a030-cfe517378db3" (UID: "5111e3fb-dab3-4a7d-a030-cfe517378db3"). InnerVolumeSpecName "kube-api-access-5s5ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.307162 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s5ld\" (UniqueName: \"kubernetes.io/projected/5111e3fb-dab3-4a7d-a030-cfe517378db3-kube-api-access-5s5ld\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.717761 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.717720 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555502-jm2kj" event={"ID":"5111e3fb-dab3-4a7d-a030-cfe517378db3","Type":"ContainerDied","Data":"81ee4a265ab02b4ce02f1d4c5493d8325428ea88dd6b105844887b3f498cc09a"} Mar 12 15:42:04 crc kubenswrapper[4921]: I0312 15:42:04.717850 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ee4a265ab02b4ce02f1d4c5493d8325428ea88dd6b105844887b3f498cc09a" Mar 12 15:42:05 crc kubenswrapper[4921]: I0312 15:42:05.156803 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-swhbq"] Mar 12 15:42:05 crc kubenswrapper[4921]: I0312 15:42:05.167573 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555496-swhbq"] Mar 12 15:42:05 crc kubenswrapper[4921]: I0312 15:42:05.726055 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rxl9r" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="registry-server" containerID="cri-o://ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5" gracePeriod=2 Mar 12 15:42:05 crc kubenswrapper[4921]: I0312 15:42:05.994954 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63494d31-fe81-4c38-8cf3-c23b9224a622" path="/var/lib/kubelet/pods/63494d31-fe81-4c38-8cf3-c23b9224a622/volumes" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.275068 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.455161 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-catalog-content\") pod \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.455413 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-utilities\") pod \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.455581 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7lr9\" (UniqueName: \"kubernetes.io/projected/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-kube-api-access-h7lr9\") pod \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\" (UID: \"5f8ee721-3f4c-4022-bb7b-f5953d422c6a\") " Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.456297 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-utilities" (OuterVolumeSpecName: "utilities") pod "5f8ee721-3f4c-4022-bb7b-f5953d422c6a" (UID: "5f8ee721-3f4c-4022-bb7b-f5953d422c6a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.457100 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.464065 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-kube-api-access-h7lr9" (OuterVolumeSpecName: "kube-api-access-h7lr9") pod "5f8ee721-3f4c-4022-bb7b-f5953d422c6a" (UID: "5f8ee721-3f4c-4022-bb7b-f5953d422c6a"). InnerVolumeSpecName "kube-api-access-h7lr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.518192 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f8ee721-3f4c-4022-bb7b-f5953d422c6a" (UID: "5f8ee721-3f4c-4022-bb7b-f5953d422c6a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.558999 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7lr9\" (UniqueName: \"kubernetes.io/projected/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-kube-api-access-h7lr9\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.559031 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f8ee721-3f4c-4022-bb7b-f5953d422c6a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.736302 4921 generic.go:334] "Generic (PLEG): container finished" podID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerID="ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5" exitCode=0 Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.736379 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerDied","Data":"ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5"} Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.736666 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rxl9r" event={"ID":"5f8ee721-3f4c-4022-bb7b-f5953d422c6a","Type":"ContainerDied","Data":"04ba54fab1fd1199f795cc03b5939c0c55e19fee5ccf277e762f88aaf8e4ecbe"} Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.736693 4921 scope.go:117] "RemoveContainer" containerID="ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.736425 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rxl9r" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.755297 4921 scope.go:117] "RemoveContainer" containerID="63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.786210 4921 scope.go:117] "RemoveContainer" containerID="b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.787903 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rxl9r"] Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.797788 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rxl9r"] Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.838896 4921 scope.go:117] "RemoveContainer" containerID="ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5" Mar 12 15:42:06 crc kubenswrapper[4921]: E0312 15:42:06.839377 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5\": container with ID starting with ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5 not found: ID does not exist" containerID="ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.839415 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5"} err="failed to get container status \"ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5\": rpc error: code = NotFound desc = could not find container \"ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5\": container with ID starting with ae87cd016a087876172438fa48a7ec160ffe04e64398c4f6522ddbcb68201be5 not 
found: ID does not exist" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.839441 4921 scope.go:117] "RemoveContainer" containerID="63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582" Mar 12 15:42:06 crc kubenswrapper[4921]: E0312 15:42:06.839778 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582\": container with ID starting with 63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582 not found: ID does not exist" containerID="63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.839805 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582"} err="failed to get container status \"63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582\": rpc error: code = NotFound desc = could not find container \"63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582\": container with ID starting with 63aaf5a4df8aab06d78dd9adc7cd29a05294b10a5b59424068409a7daeb66582 not found: ID does not exist" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.839835 4921 scope.go:117] "RemoveContainer" containerID="b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166" Mar 12 15:42:06 crc kubenswrapper[4921]: E0312 15:42:06.840198 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166\": container with ID starting with b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166 not found: ID does not exist" containerID="b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166" Mar 12 15:42:06 crc kubenswrapper[4921]: I0312 15:42:06.840224 4921 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166"} err="failed to get container status \"b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166\": rpc error: code = NotFound desc = could not find container \"b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166\": container with ID starting with b31fa9e9a9f12e50fe3c255963fca862620d3453d0f168ebf4b44584f411c166 not found: ID does not exist" Mar 12 15:42:08 crc kubenswrapper[4921]: I0312 15:42:08.023868 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" path="/var/lib/kubelet/pods/5f8ee721-3f4c-4022-bb7b-f5953d422c6a/volumes" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.840143 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mmbqn"] Mar 12 15:42:14 crc kubenswrapper[4921]: E0312 15:42:14.841223 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5111e3fb-dab3-4a7d-a030-cfe517378db3" containerName="oc" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.841240 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5111e3fb-dab3-4a7d-a030-cfe517378db3" containerName="oc" Mar 12 15:42:14 crc kubenswrapper[4921]: E0312 15:42:14.841265 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="extract-content" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.841274 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="extract-content" Mar 12 15:42:14 crc kubenswrapper[4921]: E0312 15:42:14.841289 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="registry-server" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.841296 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="registry-server" Mar 12 15:42:14 crc kubenswrapper[4921]: E0312 15:42:14.841330 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="extract-utilities" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.841338 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="extract-utilities" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.841558 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5111e3fb-dab3-4a7d-a030-cfe517378db3" containerName="oc" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.841586 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8ee721-3f4c-4022-bb7b-f5953d422c6a" containerName="registry-server" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.843237 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.864838 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmbqn"] Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.931701 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-catalog-content\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.931860 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl8g2\" (UniqueName: \"kubernetes.io/projected/454a1000-3405-4f2a-8183-36b7fdeb8140-kube-api-access-jl8g2\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:14 crc kubenswrapper[4921]: I0312 15:42:14.931936 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-utilities\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.033709 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl8g2\" (UniqueName: \"kubernetes.io/projected/454a1000-3405-4f2a-8183-36b7fdeb8140-kube-api-access-jl8g2\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.034050 4921 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-utilities\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.034344 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-catalog-content\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.034573 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-utilities\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.034802 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-catalog-content\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.055714 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl8g2\" (UniqueName: \"kubernetes.io/projected/454a1000-3405-4f2a-8183-36b7fdeb8140-kube-api-access-jl8g2\") pod \"certified-operators-mmbqn\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.171698 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.763766 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mmbqn"] Mar 12 15:42:15 crc kubenswrapper[4921]: I0312 15:42:15.822614 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmbqn" event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerStarted","Data":"a5b39acb02ac6417ceb4c766c1dbcf5a0fc54a02266dc8a344d55ddc612b2688"} Mar 12 15:42:16 crc kubenswrapper[4921]: I0312 15:42:16.832789 4921 generic.go:334] "Generic (PLEG): container finished" podID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerID="64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1" exitCode=0 Mar 12 15:42:16 crc kubenswrapper[4921]: I0312 15:42:16.832852 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmbqn" event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerDied","Data":"64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1"} Mar 12 15:42:17 crc kubenswrapper[4921]: I0312 15:42:17.844735 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmbqn" event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerStarted","Data":"da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3"} Mar 12 15:42:18 crc kubenswrapper[4921]: I0312 15:42:18.861662 4921 generic.go:334] "Generic (PLEG): container finished" podID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerID="da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3" exitCode=0 Mar 12 15:42:18 crc kubenswrapper[4921]: I0312 15:42:18.861717 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmbqn" 
event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerDied","Data":"da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3"} Mar 12 15:42:19 crc kubenswrapper[4921]: I0312 15:42:19.871599 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmbqn" event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerStarted","Data":"2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97"} Mar 12 15:42:19 crc kubenswrapper[4921]: I0312 15:42:19.897680 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mmbqn" podStartSLOduration=3.46490748 podStartE2EDuration="5.897656565s" podCreationTimestamp="2026-03-12 15:42:14 +0000 UTC" firstStartedPulling="2026-03-12 15:42:16.837987266 +0000 UTC m=+9159.528059237" lastFinishedPulling="2026-03-12 15:42:19.270736351 +0000 UTC m=+9161.960808322" observedRunningTime="2026-03-12 15:42:19.890498383 +0000 UTC m=+9162.580570354" watchObservedRunningTime="2026-03-12 15:42:19.897656565 +0000 UTC m=+9162.587728546" Mar 12 15:42:25 crc kubenswrapper[4921]: I0312 15:42:25.172094 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:25 crc kubenswrapper[4921]: I0312 15:42:25.172658 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:25 crc kubenswrapper[4921]: I0312 15:42:25.227549 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:25 crc kubenswrapper[4921]: I0312 15:42:25.999423 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:26 crc kubenswrapper[4921]: I0312 15:42:26.074324 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-mmbqn"] Mar 12 15:42:26 crc kubenswrapper[4921]: I0312 15:42:26.324358 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:42:26 crc kubenswrapper[4921]: I0312 15:42:26.324423 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:42:27 crc kubenswrapper[4921]: I0312 15:42:27.946527 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mmbqn" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="registry-server" containerID="cri-o://2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97" gracePeriod=2 Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.433719 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.555624 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl8g2\" (UniqueName: \"kubernetes.io/projected/454a1000-3405-4f2a-8183-36b7fdeb8140-kube-api-access-jl8g2\") pod \"454a1000-3405-4f2a-8183-36b7fdeb8140\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.555994 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-utilities\") pod \"454a1000-3405-4f2a-8183-36b7fdeb8140\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.556089 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-catalog-content\") pod \"454a1000-3405-4f2a-8183-36b7fdeb8140\" (UID: \"454a1000-3405-4f2a-8183-36b7fdeb8140\") " Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.569044 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-utilities" (OuterVolumeSpecName: "utilities") pod "454a1000-3405-4f2a-8183-36b7fdeb8140" (UID: "454a1000-3405-4f2a-8183-36b7fdeb8140"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.577197 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454a1000-3405-4f2a-8183-36b7fdeb8140-kube-api-access-jl8g2" (OuterVolumeSpecName: "kube-api-access-jl8g2") pod "454a1000-3405-4f2a-8183-36b7fdeb8140" (UID: "454a1000-3405-4f2a-8183-36b7fdeb8140"). InnerVolumeSpecName "kube-api-access-jl8g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.659240 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl8g2\" (UniqueName: \"kubernetes.io/projected/454a1000-3405-4f2a-8183-36b7fdeb8140-kube-api-access-jl8g2\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.659514 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.874976 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "454a1000-3405-4f2a-8183-36b7fdeb8140" (UID: "454a1000-3405-4f2a-8183-36b7fdeb8140"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.964943 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/454a1000-3405-4f2a-8183-36b7fdeb8140-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.975773 4921 generic.go:334] "Generic (PLEG): container finished" podID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerID="2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97" exitCode=0 Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.976067 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mmbqn" event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerDied","Data":"2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97"} Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.976175 4921 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-mmbqn" event={"ID":"454a1000-3405-4f2a-8183-36b7fdeb8140","Type":"ContainerDied","Data":"a5b39acb02ac6417ceb4c766c1dbcf5a0fc54a02266dc8a344d55ddc612b2688"} Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.976258 4921 scope.go:117] "RemoveContainer" containerID="2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97" Mar 12 15:42:28 crc kubenswrapper[4921]: I0312 15:42:28.976490 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mmbqn" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.005100 4921 scope.go:117] "RemoveContainer" containerID="da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.024368 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mmbqn"] Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.040005 4921 scope.go:117] "RemoveContainer" containerID="64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.040343 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mmbqn"] Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.094783 4921 scope.go:117] "RemoveContainer" containerID="2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97" Mar 12 15:42:29 crc kubenswrapper[4921]: E0312 15:42:29.095254 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97\": container with ID starting with 2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97 not found: ID does not exist" containerID="2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 
15:42:29.095300 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97"} err="failed to get container status \"2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97\": rpc error: code = NotFound desc = could not find container \"2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97\": container with ID starting with 2cb2c4608459cb998f2af5548b1cc015c979b5002a91cd9289f4b07d53b34e97 not found: ID does not exist" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.095328 4921 scope.go:117] "RemoveContainer" containerID="da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3" Mar 12 15:42:29 crc kubenswrapper[4921]: E0312 15:42:29.095716 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3\": container with ID starting with da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3 not found: ID does not exist" containerID="da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.095896 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3"} err="failed to get container status \"da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3\": rpc error: code = NotFound desc = could not find container \"da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3\": container with ID starting with da270aab5b985925cbae16cad537ff0073d3e442e40456e80d480b48682be1c3 not found: ID does not exist" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.095983 4921 scope.go:117] "RemoveContainer" containerID="64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1" Mar 12 15:42:29 crc 
kubenswrapper[4921]: E0312 15:42:29.096281 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1\": container with ID starting with 64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1 not found: ID does not exist" containerID="64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.096305 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1"} err="failed to get container status \"64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1\": rpc error: code = NotFound desc = could not find container \"64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1\": container with ID starting with 64ef043e2779eacce19d2f2520033ca27d27200502c810e1ab8d6aba166cf2a1 not found: ID does not exist" Mar 12 15:42:29 crc kubenswrapper[4921]: I0312 15:42:29.994202 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" path="/var/lib/kubelet/pods/454a1000-3405-4f2a-8183-36b7fdeb8140/volumes" Mar 12 15:42:53 crc kubenswrapper[4921]: I0312 15:42:53.096762 4921 scope.go:117] "RemoveContainer" containerID="6eabf58f4c0b301783bc9d83c11bb80d5e3ba65cf922b44717c0021379eab4b3" Mar 12 15:42:53 crc kubenswrapper[4921]: I0312 15:42:53.146993 4921 scope.go:117] "RemoveContainer" containerID="5f698f2d333370c30f0b8052f511a04f1ec1808c0feab145e64a55405b04938d" Mar 12 15:42:56 crc kubenswrapper[4921]: I0312 15:42:56.323629 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Mar 12 15:42:56 crc kubenswrapper[4921]: I0312 15:42:56.324178 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:43:04 crc kubenswrapper[4921]: I0312 15:43:04.331216 4921 generic.go:334] "Generic (PLEG): container finished" podID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerID="69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6" exitCode=0 Mar 12 15:43:04 crc kubenswrapper[4921]: I0312 15:43:04.331313 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2x9vp/must-gather-4n57n" event={"ID":"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f","Type":"ContainerDied","Data":"69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6"} Mar 12 15:43:04 crc kubenswrapper[4921]: I0312 15:43:04.332578 4921 scope.go:117] "RemoveContainer" containerID="69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6" Mar 12 15:43:04 crc kubenswrapper[4921]: I0312 15:43:04.848192 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2x9vp_must-gather-4n57n_d151fa11-3e9e-4a3e-855d-e9fbb1e0742f/gather/0.log" Mar 12 15:43:20 crc kubenswrapper[4921]: I0312 15:43:20.964019 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2x9vp/must-gather-4n57n"] Mar 12 15:43:20 crc kubenswrapper[4921]: I0312 15:43:20.964846 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2x9vp/must-gather-4n57n" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="copy" containerID="cri-o://5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5" gracePeriod=2 Mar 12 15:43:20 crc kubenswrapper[4921]: I0312 15:43:20.973869 4921 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-must-gather-2x9vp/must-gather-4n57n"] Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.436791 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2x9vp_must-gather-4n57n_d151fa11-3e9e-4a3e-855d-e9fbb1e0742f/copy/0.log" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.437836 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.475390 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp7qp\" (UniqueName: \"kubernetes.io/projected/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-kube-api-access-kp7qp\") pod \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.475630 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-must-gather-output\") pod \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\" (UID: \"d151fa11-3e9e-4a3e-855d-e9fbb1e0742f\") " Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.482765 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-kube-api-access-kp7qp" (OuterVolumeSpecName: "kube-api-access-kp7qp") pod "d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" (UID: "d151fa11-3e9e-4a3e-855d-e9fbb1e0742f"). InnerVolumeSpecName "kube-api-access-kp7qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.508271 4921 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2x9vp_must-gather-4n57n_d151fa11-3e9e-4a3e-855d-e9fbb1e0742f/copy/0.log" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.509318 4921 generic.go:334] "Generic (PLEG): container finished" podID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerID="5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5" exitCode=143 Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.509392 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2x9vp/must-gather-4n57n" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.509396 4921 scope.go:117] "RemoveContainer" containerID="5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.548468 4921 scope.go:117] "RemoveContainer" containerID="69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.577280 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp7qp\" (UniqueName: \"kubernetes.io/projected/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-kube-api-access-kp7qp\") on node \"crc\" DevicePath \"\"" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.638629 4921 scope.go:117] "RemoveContainer" containerID="5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5" Mar 12 15:43:21 crc kubenswrapper[4921]: E0312 15:43:21.639769 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5\": container with ID starting with 5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5 not found: ID does not exist" 
containerID="5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.639981 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5"} err="failed to get container status \"5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5\": rpc error: code = NotFound desc = could not find container \"5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5\": container with ID starting with 5f615a73d197807257ad196e2d3421820699c42121526fba80e93cefb0372ec5 not found: ID does not exist" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.640118 4921 scope.go:117] "RemoveContainer" containerID="69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6" Mar 12 15:43:21 crc kubenswrapper[4921]: E0312 15:43:21.640531 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6\": container with ID starting with 69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6 not found: ID does not exist" containerID="69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.640553 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6"} err="failed to get container status \"69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6\": rpc error: code = NotFound desc = could not find container \"69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6\": container with ID starting with 69b2d10ddcb7a9dd5a1c1063fa5943c354364d0d1c843868489d897c77066dc6 not found: ID does not exist" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.728372 4921 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" (UID: "d151fa11-3e9e-4a3e-855d-e9fbb1e0742f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:43:21 crc kubenswrapper[4921]: I0312 15:43:21.780775 4921 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 12 15:43:22 crc kubenswrapper[4921]: I0312 15:43:22.001148 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" path="/var/lib/kubelet/pods/d151fa11-3e9e-4a3e-855d-e9fbb1e0742f/volumes" Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.324401 4921 patch_prober.go:28] interesting pod/machine-config-daemon-fkpqq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.324740 4921 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.324782 4921 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.325557 4921 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"} pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.325639 4921 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerName="machine-config-daemon" containerID="cri-o://53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" gracePeriod=600 Mar 12 15:43:26 crc kubenswrapper[4921]: E0312 15:43:26.446421 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.725471 4921 generic.go:334] "Generic (PLEG): container finished" podID="ae82cb49-657a-4b47-8107-0729b9edf47b" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" exitCode=0 Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.725521 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" event={"ID":"ae82cb49-657a-4b47-8107-0729b9edf47b","Type":"ContainerDied","Data":"53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"} Mar 12 15:43:26 crc kubenswrapper[4921]: I0312 15:43:26.725558 4921 scope.go:117] "RemoveContainer" containerID="e71e67d67f2a250359c6ca0c79b048ff9319acfa4f7e5973d16d402b376ad816" Mar 12 15:43:26 crc 
kubenswrapper[4921]: I0312 15:43:26.726023 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:43:26 crc kubenswrapper[4921]: E0312 15:43:26.726282 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:43:41 crc kubenswrapper[4921]: I0312 15:43:41.983736 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:43:41 crc kubenswrapper[4921]: E0312 15:43:41.984730 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:43:52 crc kubenswrapper[4921]: I0312 15:43:52.983583 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:43:52 crc kubenswrapper[4921]: E0312 15:43:52.985083 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 
12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.137974 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555504-6tfcj"] Mar 12 15:44:00 crc kubenswrapper[4921]: E0312 15:44:00.139153 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="extract-utilities" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139168 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="extract-utilities" Mar 12 15:44:00 crc kubenswrapper[4921]: E0312 15:44:00.139197 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="copy" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139203 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="copy" Mar 12 15:44:00 crc kubenswrapper[4921]: E0312 15:44:00.139217 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="gather" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139223 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="gather" Mar 12 15:44:00 crc kubenswrapper[4921]: E0312 15:44:00.139231 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="extract-content" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139237 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="extract-content" Mar 12 15:44:00 crc kubenswrapper[4921]: E0312 15:44:00.139252 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="registry-server" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139258 4921 
state_mem.go:107] "Deleted CPUSet assignment" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="registry-server" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139461 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="gather" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139475 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="d151fa11-3e9e-4a3e-855d-e9fbb1e0742f" containerName="copy" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.139491 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="454a1000-3405-4f2a-8183-36b7fdeb8140" containerName="registry-server" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.140210 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.145308 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.145411 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.146852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555504-6tfcj"] Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.147093 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.227374 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p9r5\" (UniqueName: \"kubernetes.io/projected/2ee6e487-ce7c-46d6-97ee-110ff77a78d2-kube-api-access-8p9r5\") pod \"auto-csr-approver-29555504-6tfcj\" (UID: \"2ee6e487-ce7c-46d6-97ee-110ff77a78d2\") " 
pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.328848 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p9r5\" (UniqueName: \"kubernetes.io/projected/2ee6e487-ce7c-46d6-97ee-110ff77a78d2-kube-api-access-8p9r5\") pod \"auto-csr-approver-29555504-6tfcj\" (UID: \"2ee6e487-ce7c-46d6-97ee-110ff77a78d2\") " pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.352158 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p9r5\" (UniqueName: \"kubernetes.io/projected/2ee6e487-ce7c-46d6-97ee-110ff77a78d2-kube-api-access-8p9r5\") pod \"auto-csr-approver-29555504-6tfcj\" (UID: \"2ee6e487-ce7c-46d6-97ee-110ff77a78d2\") " pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.473875 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:00 crc kubenswrapper[4921]: I0312 15:44:00.946076 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555504-6tfcj"] Mar 12 15:44:01 crc kubenswrapper[4921]: I0312 15:44:01.009724 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555504-6tfcj" event={"ID":"2ee6e487-ce7c-46d6-97ee-110ff77a78d2","Type":"ContainerStarted","Data":"5756aa42969ef9ddb2ee9125507b55441306c397c42e08ef2ede2b77992acd88"} Mar 12 15:44:03 crc kubenswrapper[4921]: I0312 15:44:03.028304 4921 generic.go:334] "Generic (PLEG): container finished" podID="2ee6e487-ce7c-46d6-97ee-110ff77a78d2" containerID="6cd527f90f7fb01627b9e0d209513e0c15ba87feb6e41d54b950d6b2db698701" exitCode=0 Mar 12 15:44:03 crc kubenswrapper[4921]: I0312 15:44:03.028361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29555504-6tfcj" event={"ID":"2ee6e487-ce7c-46d6-97ee-110ff77a78d2","Type":"ContainerDied","Data":"6cd527f90f7fb01627b9e0d209513e0c15ba87feb6e41d54b950d6b2db698701"} Mar 12 15:44:04 crc kubenswrapper[4921]: I0312 15:44:04.393027 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:04 crc kubenswrapper[4921]: I0312 15:44:04.529999 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p9r5\" (UniqueName: \"kubernetes.io/projected/2ee6e487-ce7c-46d6-97ee-110ff77a78d2-kube-api-access-8p9r5\") pod \"2ee6e487-ce7c-46d6-97ee-110ff77a78d2\" (UID: \"2ee6e487-ce7c-46d6-97ee-110ff77a78d2\") " Mar 12 15:44:04 crc kubenswrapper[4921]: I0312 15:44:04.539200 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee6e487-ce7c-46d6-97ee-110ff77a78d2-kube-api-access-8p9r5" (OuterVolumeSpecName: "kube-api-access-8p9r5") pod "2ee6e487-ce7c-46d6-97ee-110ff77a78d2" (UID: "2ee6e487-ce7c-46d6-97ee-110ff77a78d2"). InnerVolumeSpecName "kube-api-access-8p9r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:44:04 crc kubenswrapper[4921]: I0312 15:44:04.632557 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p9r5\" (UniqueName: \"kubernetes.io/projected/2ee6e487-ce7c-46d6-97ee-110ff77a78d2-kube-api-access-8p9r5\") on node \"crc\" DevicePath \"\"" Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.063992 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555504-6tfcj" event={"ID":"2ee6e487-ce7c-46d6-97ee-110ff77a78d2","Type":"ContainerDied","Data":"5756aa42969ef9ddb2ee9125507b55441306c397c42e08ef2ede2b77992acd88"} Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.064536 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5756aa42969ef9ddb2ee9125507b55441306c397c42e08ef2ede2b77992acd88" Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.064647 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555504-6tfcj" Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.468184 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-564vr"] Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.481886 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555498-564vr"] Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.984225 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:44:05 crc kubenswrapper[4921]: E0312 15:44:05.984637 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:44:05 crc kubenswrapper[4921]: I0312 15:44:05.997178 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="207d4e43-31a5-402e-a4d4-c9b33593f136" path="/var/lib/kubelet/pods/207d4e43-31a5-402e-a4d4-c9b33593f136/volumes" Mar 12 15:44:20 crc kubenswrapper[4921]: I0312 15:44:20.984412 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:44:20 crc kubenswrapper[4921]: E0312 15:44:20.985902 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:44:35 crc kubenswrapper[4921]: I0312 15:44:35.983477 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:44:35 crc kubenswrapper[4921]: E0312 15:44:35.984316 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.817648 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f8vdj"] Mar 12 15:44:45 crc kubenswrapper[4921]: E0312 15:44:45.819059 4921 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2ee6e487-ce7c-46d6-97ee-110ff77a78d2" containerName="oc" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.819077 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee6e487-ce7c-46d6-97ee-110ff77a78d2" containerName="oc" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.819364 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee6e487-ce7c-46d6-97ee-110ff77a78d2" containerName="oc" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.821114 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.839457 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8vdj"] Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.966513 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-catalog-content\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.966974 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-utilities\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:45 crc kubenswrapper[4921]: I0312 15:44:45.967309 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52ndd\" (UniqueName: \"kubernetes.io/projected/8ac1cbd9-fe76-49cc-924f-434e578eebac-kube-api-access-52ndd\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " 
pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.069456 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-utilities\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.069604 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52ndd\" (UniqueName: \"kubernetes.io/projected/8ac1cbd9-fe76-49cc-924f-434e578eebac-kube-api-access-52ndd\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.069693 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-catalog-content\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.070183 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-utilities\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.070206 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-catalog-content\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" 
Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.094667 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52ndd\" (UniqueName: \"kubernetes.io/projected/8ac1cbd9-fe76-49cc-924f-434e578eebac-kube-api-access-52ndd\") pod \"redhat-marketplace-f8vdj\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.146209 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:46 crc kubenswrapper[4921]: I0312 15:44:46.636852 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8vdj"] Mar 12 15:44:46 crc kubenswrapper[4921]: W0312 15:44:46.641431 4921 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac1cbd9_fe76_49cc_924f_434e578eebac.slice/crio-8acec882768c3de76e7247f0bd082e89bd6323cba85e1c6967c168c4747074c4 WatchSource:0}: Error finding container 8acec882768c3de76e7247f0bd082e89bd6323cba85e1c6967c168c4747074c4: Status 404 returned error can't find the container with id 8acec882768c3de76e7247f0bd082e89bd6323cba85e1c6967c168c4747074c4 Mar 12 15:44:47 crc kubenswrapper[4921]: I0312 15:44:47.470038 4921 generic.go:334] "Generic (PLEG): container finished" podID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerID="d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2" exitCode=0 Mar 12 15:44:47 crc kubenswrapper[4921]: I0312 15:44:47.470097 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8vdj" event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerDied","Data":"d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2"} Mar 12 15:44:47 crc kubenswrapper[4921]: I0312 15:44:47.470354 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-f8vdj" event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerStarted","Data":"8acec882768c3de76e7247f0bd082e89bd6323cba85e1c6967c168c4747074c4"} Mar 12 15:44:48 crc kubenswrapper[4921]: I0312 15:44:48.481663 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8vdj" event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerStarted","Data":"ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c"} Mar 12 15:44:49 crc kubenswrapper[4921]: I0312 15:44:49.495051 4921 generic.go:334] "Generic (PLEG): container finished" podID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerID="ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c" exitCode=0 Mar 12 15:44:49 crc kubenswrapper[4921]: I0312 15:44:49.495105 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8vdj" event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerDied","Data":"ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c"} Mar 12 15:44:49 crc kubenswrapper[4921]: I0312 15:44:49.983427 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef" Mar 12 15:44:49 crc kubenswrapper[4921]: E0312 15:44:49.984039 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b" Mar 12 15:44:50 crc kubenswrapper[4921]: I0312 15:44:50.506471 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8vdj" 
event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerStarted","Data":"186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d"} Mar 12 15:44:50 crc kubenswrapper[4921]: I0312 15:44:50.531458 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f8vdj" podStartSLOduration=2.788246337 podStartE2EDuration="5.531430848s" podCreationTimestamp="2026-03-12 15:44:45 +0000 UTC" firstStartedPulling="2026-03-12 15:44:47.471798728 +0000 UTC m=+9310.161870699" lastFinishedPulling="2026-03-12 15:44:50.214983239 +0000 UTC m=+9312.905055210" observedRunningTime="2026-03-12 15:44:50.528321872 +0000 UTC m=+9313.218393853" watchObservedRunningTime="2026-03-12 15:44:50.531430848 +0000 UTC m=+9313.221502819" Mar 12 15:44:53 crc kubenswrapper[4921]: I0312 15:44:53.318357 4921 scope.go:117] "RemoveContainer" containerID="38b8c581eb85163b5bf0ba28b2f1bf9469a2d90b1d6aaf22aed772a03817b6ec" Mar 12 15:44:56 crc kubenswrapper[4921]: I0312 15:44:56.146334 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:56 crc kubenswrapper[4921]: I0312 15:44:56.146933 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:56 crc kubenswrapper[4921]: I0312 15:44:56.349125 4921 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:56 crc kubenswrapper[4921]: I0312 15:44:56.620632 4921 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:56 crc kubenswrapper[4921]: I0312 15:44:56.676614 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8vdj"] Mar 12 15:44:58 crc kubenswrapper[4921]: I0312 15:44:58.584959 4921 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f8vdj" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="registry-server" containerID="cri-o://186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d" gracePeriod=2 Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.065126 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.149207 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-utilities\") pod \"8ac1cbd9-fe76-49cc-924f-434e578eebac\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.149259 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-catalog-content\") pod \"8ac1cbd9-fe76-49cc-924f-434e578eebac\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.149372 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52ndd\" (UniqueName: \"kubernetes.io/projected/8ac1cbd9-fe76-49cc-924f-434e578eebac-kube-api-access-52ndd\") pod \"8ac1cbd9-fe76-49cc-924f-434e578eebac\" (UID: \"8ac1cbd9-fe76-49cc-924f-434e578eebac\") " Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.150402 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-utilities" (OuterVolumeSpecName: "utilities") pod "8ac1cbd9-fe76-49cc-924f-434e578eebac" (UID: "8ac1cbd9-fe76-49cc-924f-434e578eebac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.150806 4921 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-utilities\") on node \"crc\" DevicePath \"\"" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.154551 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac1cbd9-fe76-49cc-924f-434e578eebac-kube-api-access-52ndd" (OuterVolumeSpecName: "kube-api-access-52ndd") pod "8ac1cbd9-fe76-49cc-924f-434e578eebac" (UID: "8ac1cbd9-fe76-49cc-924f-434e578eebac"). InnerVolumeSpecName "kube-api-access-52ndd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.178484 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ac1cbd9-fe76-49cc-924f-434e578eebac" (UID: "8ac1cbd9-fe76-49cc-924f-434e578eebac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.252844 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52ndd\" (UniqueName: \"kubernetes.io/projected/8ac1cbd9-fe76-49cc-924f-434e578eebac-kube-api-access-52ndd\") on node \"crc\" DevicePath \"\"" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.252883 4921 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ac1cbd9-fe76-49cc-924f-434e578eebac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.598009 4921 generic.go:334] "Generic (PLEG): container finished" podID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerID="186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d" exitCode=0 Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.598128 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f8vdj" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.598136 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8vdj" event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerDied","Data":"186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d"} Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.598532 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f8vdj" event={"ID":"8ac1cbd9-fe76-49cc-924f-434e578eebac","Type":"ContainerDied","Data":"8acec882768c3de76e7247f0bd082e89bd6323cba85e1c6967c168c4747074c4"} Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.598579 4921 scope.go:117] "RemoveContainer" containerID="186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.630951 4921 scope.go:117] "RemoveContainer" 
containerID="ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.652937 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8vdj"] Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.660591 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f8vdj"] Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.673925 4921 scope.go:117] "RemoveContainer" containerID="d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.719275 4921 scope.go:117] "RemoveContainer" containerID="186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d" Mar 12 15:44:59 crc kubenswrapper[4921]: E0312 15:44:59.719905 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d\": container with ID starting with 186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d not found: ID does not exist" containerID="186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.719946 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d"} err="failed to get container status \"186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d\": rpc error: code = NotFound desc = could not find container \"186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d\": container with ID starting with 186fe1721c37413aaf1f28a737610990cc03d7d72e0c6fe87237ba2477e27d3d not found: ID does not exist" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.719974 4921 scope.go:117] "RemoveContainer" 
containerID="ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c" Mar 12 15:44:59 crc kubenswrapper[4921]: E0312 15:44:59.720286 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c\": container with ID starting with ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c not found: ID does not exist" containerID="ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.720321 4921 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c"} err="failed to get container status \"ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c\": rpc error: code = NotFound desc = could not find container \"ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c\": container with ID starting with ef4b151715104240b36e16f1eff4f3faa4892ecabf9c2421c2dbe1389a9cfe5c not found: ID does not exist" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.720346 4921 scope.go:117] "RemoveContainer" containerID="d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2" Mar 12 15:44:59 crc kubenswrapper[4921]: E0312 15:44:59.720639 4921 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2\": container with ID starting with d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2 not found: ID does not exist" containerID="d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.720670 4921 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2"} err="failed to get container status \"d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2\": rpc error: code = NotFound desc = could not find container \"d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2\": container with ID starting with d87e579fca261ed3ec5fc231991962222834e5cab3335f02294204402312e2b2 not found: ID does not exist" Mar 12 15:44:59 crc kubenswrapper[4921]: I0312 15:44:59.993614 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" path="/var/lib/kubelet/pods/8ac1cbd9-fe76-49cc-924f-434e578eebac/volumes" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.151546 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf"] Mar 12 15:45:00 crc kubenswrapper[4921]: E0312 15:45:00.152097 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="extract-utilities" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.152123 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="extract-utilities" Mar 12 15:45:00 crc kubenswrapper[4921]: E0312 15:45:00.152145 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="extract-content" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.152155 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="extract-content" Mar 12 15:45:00 crc kubenswrapper[4921]: E0312 15:45:00.152184 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="registry-server" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.152192 4921 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="registry-server" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.152424 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac1cbd9-fe76-49cc-924f-434e578eebac" containerName="registry-server" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.153266 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.156241 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.156457 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.167244 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf"] Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.274343 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfs79\" (UniqueName: \"kubernetes.io/projected/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-kube-api-access-lfs79\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.274846 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-secret-volume\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.274986 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-config-volume\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.376388 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-secret-volume\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.376458 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-config-volume\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.376542 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfs79\" (UniqueName: \"kubernetes.io/projected/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-kube-api-access-lfs79\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.377915 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-config-volume\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.396126 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-secret-volume\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.402296 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfs79\" (UniqueName: \"kubernetes.io/projected/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-kube-api-access-lfs79\") pod \"collect-profiles-29555505-lwrsf\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.474767 4921 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:00 crc kubenswrapper[4921]: I0312 15:45:00.980330 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf"] Mar 12 15:45:01 crc kubenswrapper[4921]: I0312 15:45:01.656993 4921 generic.go:334] "Generic (PLEG): container finished" podID="dfd10014-b3f7-46be-9ab5-2fe57a865bbc" containerID="69ddfc3295159e04abcbac8d4e38621bc8eb59c2042154d6184e9b46cd999676" exitCode=0 Mar 12 15:45:01 crc kubenswrapper[4921]: I0312 15:45:01.657327 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" event={"ID":"dfd10014-b3f7-46be-9ab5-2fe57a865bbc","Type":"ContainerDied","Data":"69ddfc3295159e04abcbac8d4e38621bc8eb59c2042154d6184e9b46cd999676"} Mar 12 15:45:01 crc kubenswrapper[4921]: I0312 15:45:01.657361 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" event={"ID":"dfd10014-b3f7-46be-9ab5-2fe57a865bbc","Type":"ContainerStarted","Data":"bbb07d02530ac57507a8fd4f409ad3f3fd57c704fd7da5dba63537e90b5d0796"} Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.054319 4921 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.075701 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfs79\" (UniqueName: \"kubernetes.io/projected/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-kube-api-access-lfs79\") pod \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.075842 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-secret-volume\") pod \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.075918 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-config-volume\") pod \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\" (UID: \"dfd10014-b3f7-46be-9ab5-2fe57a865bbc\") " Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.077515 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-config-volume" (OuterVolumeSpecName: "config-volume") pod "dfd10014-b3f7-46be-9ab5-2fe57a865bbc" (UID: "dfd10014-b3f7-46be-9ab5-2fe57a865bbc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.088891 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dfd10014-b3f7-46be-9ab5-2fe57a865bbc" (UID: "dfd10014-b3f7-46be-9ab5-2fe57a865bbc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.093235 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-kube-api-access-lfs79" (OuterVolumeSpecName: "kube-api-access-lfs79") pod "dfd10014-b3f7-46be-9ab5-2fe57a865bbc" (UID: "dfd10014-b3f7-46be-9ab5-2fe57a865bbc"). InnerVolumeSpecName "kube-api-access-lfs79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.178008 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfs79\" (UniqueName: \"kubernetes.io/projected/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-kube-api-access-lfs79\") on node \"crc\" DevicePath \"\""
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.178036 4921 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.178045 4921 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfd10014-b3f7-46be-9ab5-2fe57a865bbc-config-volume\") on node \"crc\" DevicePath \"\""
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.678837 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf" event={"ID":"dfd10014-b3f7-46be-9ab5-2fe57a865bbc","Type":"ContainerDied","Data":"bbb07d02530ac57507a8fd4f409ad3f3fd57c704fd7da5dba63537e90b5d0796"}
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.679152 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbb07d02530ac57507a8fd4f409ad3f3fd57c704fd7da5dba63537e90b5d0796"
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.678908 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29555505-lwrsf"
Mar 12 15:45:03 crc kubenswrapper[4921]: I0312 15:45:03.984248 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:45:03 crc kubenswrapper[4921]: E0312 15:45:03.984566 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:45:04 crc kubenswrapper[4921]: I0312 15:45:04.127882 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm"]
Mar 12 15:45:04 crc kubenswrapper[4921]: I0312 15:45:04.140059 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29555460-kzlfm"]
Mar 12 15:45:06 crc kubenswrapper[4921]: I0312 15:45:06.002990 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5f91db2-4487-472c-a05e-fbe7391d60ff" path="/var/lib/kubelet/pods/b5f91db2-4487-472c-a05e-fbe7391d60ff/volumes"
Mar 12 15:45:18 crc kubenswrapper[4921]: I0312 15:45:18.984671 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:45:18 crc kubenswrapper[4921]: E0312 15:45:18.986492 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:45:31 crc kubenswrapper[4921]: I0312 15:45:31.983962 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:45:31 crc kubenswrapper[4921]: E0312 15:45:31.984638 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:45:45 crc kubenswrapper[4921]: I0312 15:45:45.984490 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:45:45 crc kubenswrapper[4921]: E0312 15:45:45.985269 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:45:53 crc kubenswrapper[4921]: I0312 15:45:53.383068 4921 scope.go:117] "RemoveContainer" containerID="f7e1901b946d5d563acd30a784997c2fa500e70ebfd9644b218e4ac3e286a3f2"
Mar 12 15:45:58 crc kubenswrapper[4921]: I0312 15:45:58.983803 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:45:58 crc kubenswrapper[4921]: E0312 15:45:58.984561 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.144705 4921 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29555506-k68xz"]
Mar 12 15:46:00 crc kubenswrapper[4921]: E0312 15:46:00.145207 4921 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd10014-b3f7-46be-9ab5-2fe57a865bbc" containerName="collect-profiles"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.145220 4921 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd10014-b3f7-46be-9ab5-2fe57a865bbc" containerName="collect-profiles"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.145405 4921 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd10014-b3f7-46be-9ab5-2fe57a865bbc" containerName="collect-profiles"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.146098 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.148801 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.150698 4921 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.158361 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555506-k68xz"]
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.159585 4921 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-jwxbt"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.275994 4921 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cb6\" (UniqueName: \"kubernetes.io/projected/85cca180-0d60-4f17-b584-5a7909dcd90a-kube-api-access-x9cb6\") pod \"auto-csr-approver-29555506-k68xz\" (UID: \"85cca180-0d60-4f17-b584-5a7909dcd90a\") " pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.377790 4921 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cb6\" (UniqueName: \"kubernetes.io/projected/85cca180-0d60-4f17-b584-5a7909dcd90a-kube-api-access-x9cb6\") pod \"auto-csr-approver-29555506-k68xz\" (UID: \"85cca180-0d60-4f17-b584-5a7909dcd90a\") " pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.401012 4921 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cb6\" (UniqueName: \"kubernetes.io/projected/85cca180-0d60-4f17-b584-5a7909dcd90a-kube-api-access-x9cb6\") pod \"auto-csr-approver-29555506-k68xz\" (UID: \"85cca180-0d60-4f17-b584-5a7909dcd90a\") " pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.469780 4921 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.946102 4921 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 15:46:00 crc kubenswrapper[4921]: I0312 15:46:00.946657 4921 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29555506-k68xz"]
Mar 12 15:46:01 crc kubenswrapper[4921]: I0312 15:46:01.281902 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-k68xz" event={"ID":"85cca180-0d60-4f17-b584-5a7909dcd90a","Type":"ContainerStarted","Data":"ca35b7cb5f5e2b1169b316ad253e35bac162196338445dee2e42762f1d863966"}
Mar 12 15:46:02 crc kubenswrapper[4921]: I0312 15:46:02.290200 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-k68xz" event={"ID":"85cca180-0d60-4f17-b584-5a7909dcd90a","Type":"ContainerStarted","Data":"b5d5f07c0b281c7064787fc45db8f7e545513eee639d830ab9ceb8446a309de7"}
Mar 12 15:46:02 crc kubenswrapper[4921]: I0312 15:46:02.310844 4921 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29555506-k68xz" podStartSLOduration=1.331775687 podStartE2EDuration="2.310806666s" podCreationTimestamp="2026-03-12 15:46:00 +0000 UTC" firstStartedPulling="2026-03-12 15:46:00.945853062 +0000 UTC m=+9383.635925033" lastFinishedPulling="2026-03-12 15:46:01.924884041 +0000 UTC m=+9384.614956012" observedRunningTime="2026-03-12 15:46:02.309538857 +0000 UTC m=+9384.999610828" watchObservedRunningTime="2026-03-12 15:46:02.310806666 +0000 UTC m=+9385.000878637"
Mar 12 15:46:03 crc kubenswrapper[4921]: I0312 15:46:03.301124 4921 generic.go:334] "Generic (PLEG): container finished" podID="85cca180-0d60-4f17-b584-5a7909dcd90a" containerID="b5d5f07c0b281c7064787fc45db8f7e545513eee639d830ab9ceb8446a309de7" exitCode=0
Mar 12 15:46:03 crc kubenswrapper[4921]: I0312 15:46:03.301199 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-k68xz" event={"ID":"85cca180-0d60-4f17-b584-5a7909dcd90a","Type":"ContainerDied","Data":"b5d5f07c0b281c7064787fc45db8f7e545513eee639d830ab9ceb8446a309de7"}
Mar 12 15:46:04 crc kubenswrapper[4921]: I0312 15:46:04.650625 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:04 crc kubenswrapper[4921]: I0312 15:46:04.767152 4921 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9cb6\" (UniqueName: \"kubernetes.io/projected/85cca180-0d60-4f17-b584-5a7909dcd90a-kube-api-access-x9cb6\") pod \"85cca180-0d60-4f17-b584-5a7909dcd90a\" (UID: \"85cca180-0d60-4f17-b584-5a7909dcd90a\") "
Mar 12 15:46:04 crc kubenswrapper[4921]: I0312 15:46:04.773162 4921 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85cca180-0d60-4f17-b584-5a7909dcd90a-kube-api-access-x9cb6" (OuterVolumeSpecName: "kube-api-access-x9cb6") pod "85cca180-0d60-4f17-b584-5a7909dcd90a" (UID: "85cca180-0d60-4f17-b584-5a7909dcd90a"). InnerVolumeSpecName "kube-api-access-x9cb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 15:46:04 crc kubenswrapper[4921]: I0312 15:46:04.869756 4921 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9cb6\" (UniqueName: \"kubernetes.io/projected/85cca180-0d60-4f17-b584-5a7909dcd90a-kube-api-access-x9cb6\") on node \"crc\" DevicePath \"\""
Mar 12 15:46:05 crc kubenswrapper[4921]: I0312 15:46:05.319126 4921 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29555506-k68xz" event={"ID":"85cca180-0d60-4f17-b584-5a7909dcd90a","Type":"ContainerDied","Data":"ca35b7cb5f5e2b1169b316ad253e35bac162196338445dee2e42762f1d863966"}
Mar 12 15:46:05 crc kubenswrapper[4921]: I0312 15:46:05.319164 4921 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca35b7cb5f5e2b1169b316ad253e35bac162196338445dee2e42762f1d863966"
Mar 12 15:46:05 crc kubenswrapper[4921]: I0312 15:46:05.319212 4921 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29555506-k68xz"
Mar 12 15:46:05 crc kubenswrapper[4921]: I0312 15:46:05.371687 4921 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-w56j6"]
Mar 12 15:46:05 crc kubenswrapper[4921]: I0312 15:46:05.383628 4921 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29555500-w56j6"]
Mar 12 15:46:05 crc kubenswrapper[4921]: I0312 15:46:05.995132 4921 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="031973f1-465e-445a-b19e-c187e6fe1edb" path="/var/lib/kubelet/pods/031973f1-465e-445a-b19e-c187e6fe1edb/volumes"
Mar 12 15:46:11 crc kubenswrapper[4921]: I0312 15:46:11.983652 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:46:11 crc kubenswrapper[4921]: E0312 15:46:11.984423 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:46:24 crc kubenswrapper[4921]: I0312 15:46:24.984244 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:46:24 crc kubenswrapper[4921]: E0312 15:46:24.985109 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"
Mar 12 15:46:38 crc kubenswrapper[4921]: I0312 15:46:37.999055 4921 scope.go:117] "RemoveContainer" containerID="53c8e38e180f2231bfcb2d7973e6aa9e83964c3552717ce0385756fbd84747ef"
Mar 12 15:46:38 crc kubenswrapper[4921]: E0312 15:46:38.000831 4921 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-fkpqq_openshift-machine-config-operator(ae82cb49-657a-4b47-8107-0729b9edf47b)\"" pod="openshift-machine-config-operator/machine-config-daemon-fkpqq" podUID="ae82cb49-657a-4b47-8107-0729b9edf47b"